[Binary artifact: tar archive of var/home/core/zuul-output/ containing logs/kubelet.log.gz (gzip-compressed kubelet log). Compressed payload not reproducible as text.]
3dkkpuRKh2VRP#hi ;RLCX ^_ZaIV 'd"@r\?0toAlQ20v*D<ݢ>!EiIUk`bn<`t.@!36,|Gϊ| 9Em,2̍Lu[jxUU gFX;byPI x C5%i\4'Xe w|{P*?-MgaPٗ9%*k 6,20(ǔpòdBf}Q8zVB=?`S 0Š;}|axޫGQZ6ِCL֪XU]< ƲK1v2+Ĥ7Ga {Kr 0[@$Ұ؄YC(<]0BS +B ?.B0Q_xmNR卪U6A&dnM%Y'XV4"{!}?s6s?6`FT)h >qWbZRM5$w Vƣo><I_fCM_ 5h.پ$Clp5;߳W 9rǹL:H &tq1뷒VQ=Ĺ{oy<g__%Zu-i_A'hC(ByʃP <A(ByʃP <A(ByʃP <A(ByʃP <A(ByʃP <A(ByʃP <('`<&AyxP㎸>xHi4<^F•e` J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+J+JLKyD W(Gp.4ǒpJv W(2pk'çsWptGs1ݟP xκQƫ*9Tj#hmd#w1C2g Ϭ5[ K\7! YJ_^+ceٳ6|m"f/aWSХM_~]\+x<]~n^ߣJnP<&];^h_6:tqmo!w6e5McĀREtk(Cw7pwxw3{wН]|wf bw7pVhywl;3ھ{oLEw4{]ᥓ}kkh._yݕff󞧬N}y13NթC[zsS} <  PB0 `(C! PB0 `(C! PB0 `(C! PB0 `(C! PB0 `(C! PB0 `(CyTcHX"AKHPa(t'KVwJ^i|L?uuJMVSjT-rVa)aaP"P"'nν&L4κ*1fLEi8 \0v4)zq#ݤ7q,/em= dljFdM3|)ԕ¿ a#vҪ鷋wl5AjI1KՖ,p+roi|ZuΕ>b{+' B3pfJਢR%&TBΪT7Z[,6Z0|F3`0vNÒ(;9ic쐢b0,g+ ubSNT.6;96)iLS\{|W겶c3K_Ҍq/K_"񗈿D%/K_"񗈿D%/K_"񗈿D%/K_"񗈿D%/K_"񗈿D%/K_"񗈿D`ݙ`~aDd-apw*%O ;Z(f ~<(ױEi:x\H "qQkf{E-u^#`4=],Y$tjړD'h{nNѮ,=z3Ie:4 F] W2ʗ!}o?v+N}9^dOm܍RGV@W}er1e&|yCpKnӳU ukzVך Xm(\h00uR'oC'ƟF-oe6jsm o;U@ޢ \|+) gXy w~5Pc"?vAqhb0 : r]𕛩ȟ7zʛ>z$: y$pοPkՑlE{RJ`0F p8UX~x(*7ʲQĤc%fo=*o=9:yQNjs0ŁIht [Gq.$k>! R쑥焜zoO;>F W0w~\ FSWw|\v)Yw[׮E^..OHϝΧ'iַ_8R&Mv(vFqݮʏfg^7n щc"|ټgٵ"h_f0Zuy}z~۟t/;>T4e4n(AEeGO9vwOlS/wz>uGe9CIoO6}5իpGx~fpkFѨ1ۼh~yg0o?~;.ܻy͏߃#$<A?XN!7m64q봍brhwd>7Tx<5xLb7=v%$)DI, Ja+^%!T/P2|[_1ۘa{o,m?]2k,IzQԷQWN6m@zx+TGq[{47(sǒq㰀<+ʸy7`elӭ?m>N~w!\Tdq̸F/D)Bc8x)!6k߽"[g/7e` $&1D!\ULh!tK<_{ѣWכ8w{u&NƇT(=#7F;>pwR;;SWG+Zj{o)}&|] s65^aljV1*1;c0`:"U$mTl4y{ g[`-ަ/?>oc;L۰b7ط<1)yMd6Kb iRhޫ4:H^O4(Xj"|=re'Orc=v/Y`Y!* [Qe1d!].ĒUNjXB@[GC7q6!~*4qPvywF0B$ooڂ認tWÈ4`C3٫J5h_XD"E,2aRi1Ed^MDǒ&X"PZn=" "^`Dw[h t|^_ACLl\?tmN<>=S'OQf f"\k|z9:jO{iC!S|mT 2&bM{Q+qcVŬYa]IR:IidH5%pą[з{'M4ƬQǘٓ0!aI_)vhIܧDž_}n76ڻH W=kO=kOc9%_vW-͇^vJ3˶Z~lC3̯Bsc*ޕ?N QJSD,We5*wLmR[+픂DJM{d m)`%BA(eӁ$:䳾:>e] WK_o_Lo[#S,7I$K'=y>`^6χ<S/c|gY(ly>zk"뾞^Ks:w%BLmLDxStcVZ $zEW]{rXR&1sZ"VIPQ^U%EAWbtS1EubN!F/ K:P.Y2, 6:u,&wΖ-(F։wMѰQ8͒r/x;%R$r ^ YB' cPo^B+)(X$j"*QD,P!WCYBvP9|4]k!NcL(4Ĩ2<jDi<#AI! "UrbOc{Dc4+@߷]Mfu,ku2W: 'T}v}B>rPO'T* _uVA=ևD0R SUb#a*Z0o08Om~x\x2vp**bɍ=$DttߓJbty79S?'ͫZ7cP$Xb9Y5ZaemD0cH^Y1ɬoC%1/mٖ/}uXKfRtPgq-$Lғ7"l :l:Po[}.'%@& SKZaIR`B m0Nu`~J37s:͸=ۗ&֭y,ZEc-2&i"gr, 镤I9c:C%t/ed޸ ZȐeJ3B K|yD #1G$H E [BZYZoe]޽_+:UMR6YA䙲E\5T1MZm:'>K.s} KmTar]L}mI*LSR8Y2YtZ3qp N0i<֫G4Vt8dʃ0oi ]B{WWoڳݻ+sOo5|XSOmf9s|+Vf4o=5VUsVh᨜ ?VV]Q1~ۏ>riOhbH~$ct0j0Gw* 3+XWb'WcOMvT%6QpQ'K [FSb§WTޡ}x_OI*tףO~xϷΟxj ؝An5Vu54Zfh ͸׌{|PrV?]~6\ƧgGk6;sfj<ȋ0L3p2H.Z~*½t#xN/}_'r's$lvF)w(6rgT3.e Eݝz.%w'Кn/u?#KHלDM_*_+m uI֥}R.%׵R`U%+$@r`F uϭYMK>XҜ=,1Iwؚ] E҄2%IQ Ur+@7BublG]{o=T1N\S"b%-$)p2:ػ{҂'zmSk_cWT$cac9hVqR(wČlKՕQbOExf} O.e9h+lj隔 gNHV)dN$ 9yP*"ˬDV^ _1/eT("J嗜*ȐTlRg&x>//y뚄M9[wpZz[O;gaH8]}Y3¢6hgnWBkR=vBs[i:ϦU@ލMoAG~ŮVoS 0WZ!_K[V}Pj`b~ }Ns]YH'iRLv'|lEQէ^7K (Z.&9A˺kfZ BM(HJ Î#aBx@E=k 4.m8K '۰{aݕIbW26UF|Aab~yA P)\RIf#dR&~UX4xOQn\0M= PR #)ƥ, E!6"BV&yWCA+/&hD)QpHAbW5F RRŢXйH<;ْω vs؀x_bn0ɔpt l]T'olQStķO(%$ -0S}*J2;Q*=D a` 3hX21{V3V+MOz91.&R XNeELRRRJ)[K5 εQJ o{N|#YUxkj3BGoj.ipEd@! *&u^ff"%K"b9'쩭;V^wod_3y,񙎉odab0ɈD w*%yAJfZ] 9$H3 t=!y,&.::ÐR:[T^LŚa4mf[j.pH}^7N kV|r6ɣɬ~Q߅/DXEDCc,S*MAy=Y=wyT:֘=r1{QH@E19)m`TR&:\]`Yۤ@f#Q$CSVZ_bi(bm#]1cIFۊGKs?k'tsާHBVd}IS;;*?%(A73Wu3=B,4^]3gj>S5gz78Zo;eR|C</?|<5L[tH,z iԓa81j dT?KTt{ Go_D 5>ؾP2z{ZlJ.q'"-?2R~yV~~ d կM$L'!,:R>;YdRA21n8QjK˸innϟ*8iPb<4L]e~0S< =7ٚu[?ۛ_F_FFk4R nF;\7#~h(`lr[6Y] k:۷{b|W.k/۞|!y)WVp~-ZJ} O&.ϦB4ᄏq:o~@"`C hfP&Q蒈.},kg˲WZ Bx(Q͖D%$jSbȅ z:14'p҄<}I<]3c '/,YF:P (4K8ǧh1F=P7G] B FʨjcSazF6" 2sMAPPȀâ1@lQ(0Dm!CdJ ^RcW;f ;g=b焽Jr>~}=$*xԙYAA0Y 1$Bn"XO^'5TLd"Fda* 3p8>oxfKQ8ܫX䑱XGmկ51 |ؾ͒(ՋN=Uex@5o?-"naU|TR0M0Bs!1!ʘk@eb@BۄR\bh_&ZHB. 
i T)&0ȏxy̢ 5sqqX83a0  -sg{^(l[]zſ:`ʫoOخX¥lL*4䕊4\rR~"eIEl B8iDR{3PaՀ%`#v&&qՎʢG?ea.F9Emڴ`l R{ x `0UHRcr Qe[ Y9E͢+0)4658vv^ȥRjV~5s7In&Lƹ 0/"D\.&Yʬfh'$Ŭ+F =G(]JSF 57f`h2 IǙJȁ%ހ!A&Օ:zx0s,V 9u{"\\pN> gg%qٓC /J5']kbSS .>.Fqwa<|.#uu}6kQb e(6G-ՏTH` >.Guso4̼O[q}jJK+SC 4Ë/O7>nhfIYO_FtmMm) c)AIIqm!tV@J <@d?Oc˙<:Kkҫg +zrx6hS|joZARlN8v +:B/6YQtAKet#̈́jbbMaY:BE'hXٍpբ;nIN4A;T܈a=~/esOic>$(l%)9LE1YZ;[dAGA!nvG,MrDʼn~M"p+I13r24,Q[Cb1Ͷ %8 .4b=r$u0syieA@͞b}Y+1wqOwٮ .N>M P_|/Ӯ|jfMS$MD5iJ9 4NpBvQW&<.ߵzBςd %FF& $hBZ`|<Sdږcݡ]AҶmUۦg' xfy 捣];_8yOsξlCvV=Mʾ\Q@v\n+Wn:XwюplpsdF_ /^R*gRxвwX3c"FLڕ]$ؐ o[?IYPKecXxsMkϼ%N sM\ bAi!9ٍlA-#T[ Jh/7V9|\Z[}%Pp&cy~(~{i?ط|,Xx:yz)#RS(S1NYb*Tz [2 j<2xd>1JY9/6Il>`+T6dE+`XӠٜ#* 5ω#g?ӣ~WoT [+ߊ?<yydDž=;"/j{1h_8ZK&][" 9֯.Nmwni`=zNJ}߾v&'!@tc)#zo?_-}M(eon_>I~}%H6| H{}=x}!*,w0:^?&_Kz)ճ6 _[=ou "%jp— RHI΄0wRVZ )y 0i3[yBiBSeJIl] 1̵Lv֌Q3w?X^ЬoS3z<0c7iP@h LfF[Dв)ANcI$#ǂ9tN[GO{f\Cgmˆ]_D-߳ =)u9FWͺo}CޢgQUEDTeR<f*4P1u).X؇dR#P2wj\m܎O[Qoz.7E|;̮s Jbp-+>$@hAԔ([UrSci-,v`x\iN#y}7A?ZjQ=drZKVoD$,SO,&|}!=VzXD`*>d$*۬G,ҡt0Qa *yX!Z3.k53Sqt+- E{ 2\!enF/, }u҅b2 :s*BO@9s>@oʹz}SJĘ^ P|,(Bqz+.zquߍL>zfS)A~L* 2d2|UD ZѰoI36OJJ"sVvQb,j~Q{?ҝ| 9{GPƨL/ y&CdN}>|^'WߦܮV߼m_'OkId )%SY/)ma?s 6`Flfk.-Pq`( Q#} 8nB@;sf^\4D[A Eq:1% l^xKua9NȞU ƒ)*|y23h$3;hj*)Zfo3L2'T}+t>ol$UA8j+c]Tc 圻,E-a*-A˨Y a?6D֤}E 6z_I5zjF`V{ c8̜8Qq_, 0,Xc]~xOՋS _u+{ſ:`ʫoOخX¥lL*4䕊4\rR~"eIEl BcE!ve=(Ł jKDs;8LjGeQ䈣F20bv#Ŝ j6 FmZP{awV@Vua-qd<e0MJѪ $1 ԨC-jɢfkX;R_krI5+9~7\q`D ".xG,feVSPQ4KVbܕ#upd#VQe)ZAGI4FLiFp7 ʷ<s{zz7lzN{ü䞸f0.\Ss@YI\ld8Đ(1ŋR :Ӡ_Kc96\| \"( nxv-G?lȣqfwy\Ql ([14wC rƌݠHna`qF4! />fFԃ;Ԏ+hRQ( -`دě.߽|Am4ݹvBli^5_^=](]}OT4)i>k57 te+`U'9tZdK,( ֥2:fB51g1i&0sITrt!͢4,cuVFnjQwg'FnW~/esOi5%q6@a+M9e%,R)A;[dAGA󛘲nvG,MrDʼn~M"p+I13r24,Q[Cb16gRM|vwRUp:Z`9ἴ fO ϿwG(_~gZ8Bݽ0Ys -%]-H(W=3HCR҈idؖ8_=.7"Zlxᗆ~)fSWx͔dbhH$NwQG<)icw_:ӎWS=ܭ#vfezSgۇ7t4+S᧣2L+/7 $|Z恤@9ɸiwb͜4V!g7 uh|jrӞ !O}7nh>ntGMi>rb>]q//ͨ:7 ?f́db&jmmɖ1y'1͸z&>9;w԰@]{A`xOMxTlb ,zrGNވ{8цspWx%02jNYw9=0z`}{끩d*HἶQ8,ңUb6bq:;<*DSFen)nӆ.%6h>j;EL$Bdv!e{3h#XW rg3gu6 _Ӂbn;tt t$/B]q m\`iH9pV!F ƳKCFAFm7Ԭe]xc-#)E8CYE..r *59^1WY}\}-/ܕ>׫8Mc~!ҙt-6.rwW Mz7?G_ ީLmq|wW=^y0`ܲkܚׂ+zo[Fn:\2bH v2ʪ{f2>wcj+LQ~wE"-HLBh:PC:y#r$iM.!82D!h1Ƚ "D锨#,JrkQ}avC{0ռ'Rq?rfZztX]lnzz~ kW(un}qhXN)N@TB Dkg]hUc~tUEKqC~8De2 rWNJ:Y P& Z-)x)ʷakzS4j9UӭYdw -P!vzUWFA>%*/ŸceOBY+'Q-e%58i7], *Z5u gGfRtTk$(PֶFv ^^GijȦvkMQ5j+:`ɟH3έ aD3Lԧ+S[b8f^JKf^o%\Tcu&릓?Gq6Bxc\ε3B$stu#ut)R&x%8sp*,ђ5I'h F?沮VdQ 1Pcryo1N@]BwLl&FΎ7+pr 㻛Qt뽸qEz4cK636ez}iqn^೙!R6I7Dw-VYڭOj>umݴFJcWh +e)' o/d\3\[/,@I 1C$\NOP#] n6 nѲCæ,L|-SH+3%}' &14!ݰ)6|i>NSA'73ln33)kA5`ZKiUV٤aRkbk.45 3E˨BdHAiKS$e5rvtL= H /D # pA- *n]: f 8?(f* r"fk$ɡ '@Mr@y; c~D| 0sɲ\j*@Edbr7NƎIPPh{fDFҥ" R&FM0"}'cG$c`"?eMj#ׄo3sh2 3ƞLf!+dfj9=L0]f[,DԭJ(>t|3h ;_$Y!8 Rhp?|T*^PfMNJ  l-g]U4 _'Va+,6-@#BljeBS?2_p.B!4NO(%H }?ZE?s*w!M0'NT(S $ԆœB3Ko@@kˤT4p ڟyOdkƆײ4QLB<' (S>f㭐Rj{n Xp29WA Lkm.;]*H ktF BT\f B{ XeeʁxJyha݉5\+idH( TXwj:!$LBз^=FU˃ aPLEGQixT( O}(NTUw|w^n%K61Pn*P4h q29y^^i>EJ;#*iJ 80(T!x7|0D} k#UBvلF3NiHg<*5ѪD=Z-QN8e%#1(Zk[#g][cky)ąY T.\ޜrqiIxbB%ʮm3v1eSۋ 3c^htsk$>Lpu&o?:SرיJC; לH LU'\L{>a @M̑=Vk,p4`Yy?ZQ_a?u.KqPٰBڜAίK 4;Y 5-U?/Z454!D m2[JMWoyP1oFGR^$g?>M(7=J{.{B{kJ4A6YBio٨HxS$k *\\`ב\ɽB$gguTv,WU\/rOUJ݇w-W ff*\*\U)k\x32jٗT);c qUK ƹͯ$EKar,6LĜ~Z^C7E}M_&Q|  xUy|d0OIrFFOg֨&daޝsw ltrLZ4]r^vMؠ ?!x#V!wIT_ځdCD>:Թ{o);c)HEE̦UWG8/B+ kdx׉%KY02HFQR%bNB*PL.Do:`|L)mWZ%Ũ 1{6(T.J$1 Td퐍)z~/"ϫNۈokyaP}4wNj;A=6$ϡ֔d.pcݦW4XS,cQ2 (Ŋ&MDRV!b\}m-+lRןoGy)F']cꂟp0[yu?{}_pxj%['['Y5w|a~Hѻ&LNct Mr'P'CMp]|Rq2)Dn8CW*9-816q֢ YG eR=:]R +59WP7C^o\`:eP#'BЖu:bKvX.ݿ5<|cH N;atBd9T}A߾3Ƣvd-hd|蠖_*ZbL9eTު*Yb,Mʪŵgk_]#[)ZW򉐥Aﺯ}zq—ߚɟZC"Di o<0|5G=r"GȌ)FgYH(/Q%9'J V\@;;]!gkB9E:%@%P)doD뢲!r/5'_ڨI&CtqeY;󧶢4 
XQxO*읹mAmw4=ۅ7ozt6iیKOBm\<Ykpb*[Y޽yFG2'ql ̴ufM/oY-/ݹ^*[C1o46- ۙ~gZU{45N ՜~yU+OzcbӉ1tuhrasga|b8'zZS**b5;& r$Kԙ|c6q׵1YǫR$dgIf 8huNDt`<r+X4estݳq(crpW1d kU=) c!I=` L :mPr^ED٬؝҅r@W  `U&,s`CVjꀘZ{Rb!/}J -*{=+xLW!b %\\CRbX0fH<9rіfBA!ukHm^Qܖ\m(.ܟtњ ^ars0`MG%g^Kߔ)X Ta|(]=;C-׊ f%쯋1' ظžDxJ()jL:+^i<72xO|36V[EHCFvüB\7} 9YiڃGyؒKiX/repڥd6K=Ǖ* Wru\_d久Ԏ Gmʋ1^ePOo:nbQkO;N//w?cTxo=:d9+E0[[puko޵FؤkgIW{|pZNǴY[Swӳ۾/.LF0 #,6x:͚huijd~nsᶖ튔[#i1Z "VfT@tFb88wF߮@z1MX0M$*GԆW߮zUtdt,QAcAc5?w<8iN%T0bλXKɑQReC*hefW6D]KN^BlK]o}Hg:q`Dl|,"o]2k}8y< gJȲdK$>& JXP2YRP] VRvb3%gioW[8f^aſs|.9l(4@.HY.6` J+2 ^e89Cn'HZCJ"JSES!)Ib119ǜ :4$-mF/rH82,c*Fc{6K=&TBHzLU%1zL@I9:5dm-$"Vgzj[=vm拣$ RAgq"m(Y Zۤo`Lplh0]=Qs2S-Ѳ:1z!RNeRh 2tZٶ;n\]ed{r\f&ˠ-  qlU"9[w!GS 8JY&c^&)+,xoYn^y-,}-J-|Y.%LP(H .iP%mqZyZ^wjgAVƆDNʫiⅶDsvhm<;v/mhFQ IJB.!Nd.god HɨswP 4/Eg ??09J%?0j)R\rJ ՀD%VgMNzkY7kA'*8D|եD5A'WYh;QNʵL'^Hj`d6X:z2E$Bf8Bz#L6ʙULD[[w'",[Үǻb|}:69-&)E) ,DIg.HN+ 𒓅ٻF#W:dN>pJXx’F8#}3 x$I ]Ϊ*3+ ]/Ŷ=ز18d![-Agx7Z_Ue9o ~$AnƽoW֝MxVp.a>y~6jwJrμRHJB.IOЯkw;v'љf&09H.Jֈ^"e 9V*r));["RzYPEVBHZcL{ 14xyZ. P54nj7+ѿc0鍱}  +|"+0%-"Ji|ІRynA$D.3'Z #gu!)b9IkIobq: hʨm"Vޡs%6h>i୍c!TĜ7e5rH$a6m/J>sIj%5ukٗrTlFeKG$plQ1L1R@Hy{p0"s_ gH1нdtd>iCmfiⳏ2[mO:n:RQ hsI~.ri?_(VSgEsh'W8b)o$W(߽oX2O{" '>vX7^CQGiƴ[֒V X=!mwm/i9k=jx5Gn󒦥E "VaRɌ b_zYI25xa{KII2#i%vAj%h!IFtPå .y=!&ꍒ{Gc2>vH^΀3HH̶ƚ5xaSV ٿC}^I1u.֧a<,KRKyނgɑ*J+T Pdߗ"y_?J ;AUn!~6& 2B3t:VR< ?Mo*5>3x>C h|5QyotmP<\Dm 6&k)3yWTVQL4*%50B;ЯJ E1 E<:Z$$㦌1f3 ,cJrQTqp@mdl=Y=,l3B2  (ɢR? =P:Ƶɯn4| _ #6_v9y!֧[J%"kZu`ސPs= a+JDlt;blf&t9L۷m4b8ǂڭqǾ[Fmݣv/nTJ,lR+`O.MfԶHog<|[[af6O3rG l#E\:ޏP|+i!RHNϿ፡YEJa\*-S&Det"xe.6NF},{]3⸳D T #(t$҈yaڨ5g6'gJ:S)βԅ[#g,u>Gn<6i̙dbtTqʥ4kIh1sb_rZj qTFf*S`ʲLUWБI޸!|^Xb[}s#sۮ\xX{(?O_)1eo4p2cR#ɴ_׶BZt yXE|u3L>y29`سa#}0r5}QkR=J{K%ϰ/*@Yug7@tϪ#9PUA:w XPU&@A\4m}x'y9- ,P64hw;Ӣ5}V E>JxbO:/|g~=j=}-S#*2^mw̺֝/tItpyZ4r˥Jׂ6j7r-]څZE~ڍJ p%{w)3l \!vZEJzzp8'Ru`JUg 坑 =\ETkŊm5GȻ6݆ʶ;>?-[UvzYcc%^'?/?Nu9i#KJs\e+e` 6݁i Fj4Rz~0-elv@bݱ!\tZɏJzzp%Vt`&Yg U! pJ A-\w\ G#T՛+-Cp(ZBhW dtH*EnD.ٮ\WHs;WHWV +$]Y) 1+$uƁ}npB*ꛁ+ҫB}' xle,]F|%j7*LW}4+%W@݁+$Eݏ+ֲc+Ұ"\XIm:W@3pHWHWH=\AT)*қZd O9\s͘]ͿøEV(ؽqߗUV uj  󵘌aBӳ,$)(-at->*k^)B -:20M ,x2d9؋.KhYa0?<U7|GN R3q@6)=w6Xl&LͿQ,h _~l sF:Z&*r)ޖI`^'s WgkHJ(!D|z5ʇp&+jϽpHX6sK*u.,'_]ur6jICXZ-E0]L '%e`Dg49$WwZyP)57 mCpj);zR)lWo֖ԆBryg+VPIW{z;p Cp~5+p*JKWo\ HMΫҗVMMgȖ"x/n+\Md]EV-zchtVRJ}.# !*a^wKrp%<Cu_nRP~WɋB#ͧی`PE9ZrF@\`EO7j9 56H,W>\.kvns zDa ws<5J:4 p`iZ`'o;'?e5_*<(wS7_.\=ShrsAɬhJ)E_ݟ=> m? ٗެJL%]6ES7\;l:+[j*:]o[W^cX`?mɶ Dwpϑd֑eVűlP䐜!3phR7-gͧYDҨ๿OW)F`k굧4bkY-/&\}jj)O܁FTs?o"@/\jG@tUw3⩗@Hi?nYE+MD+iArYfcLц}4?%FV|4 SP+tmp&X/sHh8#sh8`gܜY))xЇOXeֶBI&qhI| ;&hKѪ}RKiQ  kR{\ї[G.Xש0Ci'< r5$˄yP1@+5ĠzgIu,+_-$)#)**E Mw>"OF0 Yh|¼&knr mCTzfɒcfm]r0- d]XĜp6N܆t\TYjƨxc1(T= t֙^8ZƔ+)$O2vD26A(yԏ'-ks5/\mh$ma :eoޕ2{-́Ԓp?('}Q8aCMNηZ omÎӰiOvֱv;T}~wr]jQ-޿^Qi{y Nǣ-#<ߞxay?f`V1'N{uofn~6O.+6&m7v(W`Пmƶμ˒a:7`X`~EhHz?./!V%[CfZԖmR{m'HHaٽA}>Ssije/]uڥ/g4j51CJ9JH23*gh齳2:gr@\eގvF1YWϏ-mß#A=݉0;Ԫq_?_y?G ' #HnyrU N~ő ytUG1䨳3 +mqt2VEp RqDө P&ctZ, M&sJ`pKw!d;7ꌜ=l'1@u1tDsYڹ/& "wlE;6B !Ua>{הYy ]=|4"&#tFixևW I=$5X\b=Cqdt5|cv \B* ]{vEg=9dzBzW!nV4w'4K-ٗRlUC*O1 ]z|~Ё:|dRʫ,+̹{2t ]'Cב@Ҕѥ1:\Y-r"2R~"f%31 c.:ST\TzIӱ˜xȤÒ򮣱QլO5M>^q-F*Gtqi\z3O{Tҷ74*xmU/fuq.MNҦc E(zKAߐJBCE+*]һMotUfeЏGJVTZpk5_򅲲Gk%wCU׷|}}x~w.w*^Z.VWKe2ijdj|Sc:Ԗ5dJ|S)y?H,.=(S0" /b%J *oAUF1;uF߽ޫ 5_E+'j)tg32P*]kJ@Ó=Y)Ä8حӍߞLFaN"0%3`/*ajU1页WZ[p=zbs% Ր>K@XZ-3(U`iU*0ΖH 1 HUFt<6~&XnO4~la݋w4"])RFW^'3)ft7)92q? 
_+GRA2ZH4Y&-Kbyp@@qE$D&gcpnnD pZ[ {5(5t^1+^>W&gړ3N֒7ya+K"ӋofwuIm}Jt<]%O2wE,iq3Qgܑ:PSLp%g5= )tF Z<[taP[E17`[uMh@޾)|Y@߼^_!aZ.owd1J4-ZWE~{>^UI$g3ا5}[Ǭ\vlixG[N\B&}Ѣt)^̌gmyc|nƋupvyv>GyY/GvPD6y>oxF;X[K%4lkF7ek30h\Q4eN> >-zzf؟w e'ZmMi95ϟǓ:+|JW9?qsqm+*RCM.{}u]v0xuLD )_~|>JME 7PC=~]e^Z~&16~iUT*Un ^B!6p%VpͺSG:$? U̶wo(I ֋?믤:e芽#HlSNelBHjk|"d@3n-1,h|?4g/xiפ1qm޸%c$-IP@`A0!W&D.-!T'PS/d}붆xb065k IAwQ}UlM xz$+ڞSϿYY=}漁 Pu1#T/?ĥޔt8bJf"6Y` +x&CSI yy5uy@ัw"/b>ʕ{~>xOUxvl+I9}mvjS:;f-vgCZ8e uc:n~؁"f$X9f\=WQP΄ߐ4UC⢮;Rſr) 'u}-:f.#l6)ltP aK>wGpH%Dbf'J9VAD,r9) Ac=3rn#hMˍM^{rc*}RZ܆ټ/w/Loń:8'9qur$a:fF'uPqNw<ٝ75;En0nxn؂oOG2",EMdV͹27 F]DE`hpRNB!V.0tRfST{QQEEGET֐ !Dp܊,!Őt`b"#FL*Z7Jn,L(tHyL\Л#A@H6s&` c97ź|L@ (|shlnلhPXwy-).qSMӠus«WS8(bbrkP+-Tj{55%UY5u'>b%ch\yXh$ g)vVOGm'CJy$5Ze!%sZp(R4IfK?^@n%mŢfɄ7GCTH]y!@9QH\7t=3rnVzLξд>6E/&H?[eQooFKmh%J9x.Ib@?˒f$III% KSDAA82DlI*I(ĸRxTA:';rVHaJHȝ*-Y /JϤY;\ver9{݊y`Tz|,Ȃ 4c&A|JDesUhUR x r|.ESB:O%(!`F(Uў"ŒwtiP*:Z0tZڱ^FUAda"$=MG|d8R"$!dK&F'KNO<˨`MkAdKs%,Y8:ڊ牠8`bDb1ug!JL==E*|c[ƭ`RHB kTRYQ h%vQ 3 \LНPlcX q]Cw,k4Χec0hO{EE*P tH8=o-~F5s%ᤆH_|w}W͟AQԯ| 0|a{QmiTlv0֕}];ZumA~َL뉘N?-8quUƚ)5S,ƋdH>b:|}d)zf*^zXJJHmSO@Dw6^{QH;?xd<ߣKU.8Zx~䣬M_Mp>OD}|>ͩ_~eYnI,K<:@uF؅ Mhi~iyj_$:cFʤ2zlH&/(4FEIٔbDt>yK2R½L+"QJbбfHWMİYw,O*n :A I0:s( ot"/P_%6l-G# p.I~ђQe$^$޳3lIs@v3-exrtɑ9qaŇtJ*ޚݞ(̚t-Հrku~MQynA=hu!))xMS1We q/UKdŔK4Dh :T $ BZ*k(=-;Ed"IxYA526~dlUaa+8 ]c,t#>(^~ee2c}J~Ņeބ=8-Ϧ_[%'N"aJ. ^JLEb"`~B (D28W ًE Ҫ¦vIPo'| eGFbF~Nyb j7cQ6=2؝ٗAșM|ՏR(-8f0J&d`|` ArS<$ r_4!3Gk *+boTȲsN52jͺ{~+RT}1"GDYwW5{e]T F HI}Wƃc7Hޢ7PLϔ2Ee7Fk Τ.`UI%eoС^ dnw~D|y,S:lf%Gq18Nqh^JR$λH(,E'sRm #.>.tx+8<ܴwo<|:Pro="'('~"wH+M]K O@:RK;Hl ÁS\K<=)SQ严&N@7*?<QT)mhF1ō]Ml 봱h੺61_}'P z 24])2ͳ>E0޵.HJP"_j99v "RP`QƠ}b(U.Uy V_@.X !Y(ZN#JʊзPJ}6 %>J*I(GRQFpYIoudŋsw~C)}J1ԛv;3}-_ևZ_ow\އςzއK﬍%N'H174ћcǠtjT2ЈՖ7f< O ZcsVFl^߁B&,)0&$R"4 K1$,Ux${ũZ$$i E#cn-xլ;Fg˪N{wKf:[`jw3uvPwƱ <&!B!f/Q˔҂y+ 097ũxb3=#|nȓlfNG MDL& T1KU&ρ$ ^̈D*!D]&D|[E?XoR5{(hA*ש{֞4 XlėZx]~_薋W;yz^4Eirh|$U.Pb#ƏIJV AҫJ6h;{ Rr8T(,z)Vw*q]N(2[U/z'A?qLwR Xuծ [aݿ/ʭurQgk{l&{ևZ\Ѣhqסo5=K7y;All?z9913n65m^o W}Pujq~g]C jhCjC6MC5l>Ȝw[P-{kKpV_ͅI%V\Mb)1׿co+Bo=e{')T&.99v;xOzI˿][_>;_\ 霍j58ij_Y|+558QO+ٙ`]4u1NZug͏)|:~g?iwcLɯ>%blgl]: E+G|?_Mgg?^^Tj:okDտ2>ɯ{~NwX\ׁ~CJ% 3 莭fBv4C /^wWcL5͛++%+0Yooǀ5.mRCΖee΀O];@v`CNx6H0f ǧZ92d;'c]tbe 2>/=d(4NY#>F_\ηW&?|~]e~>ܯ)Hrv:}-ωv{>izIQTe.]Χ#?q=ڜ^{||͛<'݆} 񁻴J G{1x7-VZį,~Y}DN8z_-ja3rג_x6c|R]!g%$>8] )$<䂲YgO-='3>f+=q"p]*"ʋc "my-hDDn΢Ai*!r G YFxKX+;Yw\g)-#_1lwjyt5l0\pĐ5J.;J"ŀA0d%wHi\-7@}N?א>56lBp'm2$DüJ%6LʁHQ~']-~Ek""g'TT~MTHsZ"]9$ pԫf6qY۾)~|6muS0<^V-^H'к S=f(#ViRW[U)xȱq\Er`XJ )lDOAFj!@I+aADC7؆㊌uT4:laCeXs [,g@y]:@6 H(XqXZ7*~UXUw ℄UHDKVGXZ#ϛ]7{䶍e ?n6ޏrMm93Ij&W ֲԫ۽ )#neqYVHA\sL_iz&~tP]}2Ru%6,] ; GIz|]hsTږb.z"e%jʹ:73ߴ8fOFUЦ'狓6/yq|w b~A ?x_MqN{m381V7ZR^ۀ΋eplY! ^6Y#n4g;_|ji˯N hh KG< m]ћloV8Z}qha%ӪFn~Yتӕjxy6.zPysX&|OW?TԲ}%S卐 ~!V{X]NuSg˳[v^k t|t vcqbAIcn\4axrg֊Fv="=EQyw`[hU6p97B$lQ415( Z&),N1&,+D1aKF$GQőxITid,֜XR#X cS -P'-C;-Pxz1_8b3j{"NA*HE"7OIkZ؞ro#L8 =32lr^:mGMLX؎ Ô~Q9O'y,]ltuaj vk8RچCX,0@%KZ!axZh(l"GpQx"@ 9hE& W!Fch\Q ȯ aƨ_.x,Xl|MaD4"mb)hQ4|h 0v.V8$# R )ePJm Rpj>($P+YLj\IyR[$ǜD/xsK`34ŗNx4W)4&0"j W 㓧seSͺ(;$$$ezMxK@2+5:S"*`@$fw;ݖ,,dP꭛q댂'xsI.@'ܕ}rץWMRK~I#%}=j䍯,_ۍ{j򾵫}jr>P{_޳FSiwM^ߪ ;\\d/\Hwt< #ݯI뽪7'e-V7^sWw"u,W$*$k8o}ku:6d]K~U&X[T.|+_L\V_z}d9=;Ez#)jlIAQù)W7|tӋ)2[IE=|z>-t4AˍpϰM a"X^;UmhSȂT5/Sh؃ߚ*4ͨ,XxQ}`'WBP;-YͰCuJNiriϕ^NgΞB*sgs'AKsooM繣?.hԾ6چku9RO{qu_+~V(tg3F3J[Mǀ];o*i0`Ņ?q`׳)66+'D,/L#<#.k\WS^go kؿ*B+ZIY-΋eFt0B! 
6Yn4g_|ji˭Nh4@^h\z}=jwmam]J i,@G:1Tj3k#}N*;uVS"ɭT@lHd3p/Sϗ]ρg9PBKk pZ)Cr*!"锨G#,Jrk)<@;տ߿&=kB>NKO3caz]]gy{{"Dž]!޻gj}qxt~:dl%|!AJ'BH׎[G* eig<'BOՖn"(dTiEpRN\g5B A *1y9E ) va^tmꡛ!nxJ'v`,| ן}};f욡1pԹkK\F5+tqp`O,UƻѲNc9U+|7mor;oƇuoiG | ZǕUWbj.u̬eRmI`ݟBwsVU_ܶ3{{/Unv-a DSƔ*7 mtR)2vي l87HŞ4fV{)۾@9HhE%!iGêcjcD*~[FQ| F(@"}H.x%($8ڐ};yg'ٱbv̮d a&~l./d맸AΆUm WW=Ӄ 5Hi[Lѓ`"fQYr|j&mф> t.e@0$eI^$3- SWE-l`, )ȸ)3)䔈!YZ6Ą$JF' cJaa'_P~Y. ud\b,ZQ]ARXB2)> ~KPNB xS(6IZ,(MW-b4Q&MOl+u9 i<#A¿d! "%-mMri7U3_5jsB 4uI[w1hk|&s ½[T8MNwgCo⨏>"Ơ M`+dX2ۘGف qgل lF0D F'uBRT 3 RmXxg4KP,"FIfIIY˾VWڦ)`l(Jt48( :˛':Ύ~\ Y|jAy׻zޥ0r'pΘ|/>lr:aSm PI<>$ɮBt:6'gD0-qEhl*zQM6B=2]O (^~. 6\ FaY~m YU"Wd1m|!gx,|›ˆ; AZo*!vJPEX $^D-A3ˑh%@mH& X57ӶRz~JIϣ7 G ΢|h QoL^L>FSvޢ%W`b2$Y% >0<`t/?}{~q½w ,l`02?|kjo9Fsm^lm>1UUh?(p8gFiyv0Jd~4 Nq1D ?.CRB%]vћZH^ȎQGzH\2[q))ip(Fyu9ƾ)#1_SUG6Wv'azx|wJyJ]u몽#hXp*:hT,hz [#KܴeOzƚ=PǿeI*av)XKZ*G PE+'0zI 9lP8yvr@8iEKz;] ݮB@{K3HdD2Dc0zIĘB/ ɮPéV0C̽fb hd̡82J`T)j)*M.HCeTH*ΗFKCweysy7{Bo#5>)oc7}R ΕrMR9Wm}E}s7-C{>e_A=#`a+n߾]}m3j2eW*+?Y+%:y( Jt%&h1xȸLwRubE=PZz 쳤4e,xJ˾2Ȣ)Q*Rh,0M,蒷!ݹwW&Å$4q/\W')q Hy4}533 y[.ȟѧ#~]hw$Hiu~ˉoJY9h/\=l!9(&i-s)wgæMIc#{ЃYRE,A R(\dVze*>u}D( UB"d3RB>L, xBg5VY[+yڇD;WLqبD$="Ƣ88c/sM Zg%ݸfʘy&|y͔/..yyVy&ǂW?:&M6s#ܴ Y%)H1Z{i瘋LA4Ꞿ ܸjs #%iRXH2AA rPE٫`9- Js[?CIWLTR$Cur= s;7 V' O&v$TF Sg$6bJ'QCeVMP㴼g [I(^՚'l j~me}rPQl)ƞ 7pjޟR<,?(Lj*T OY]hըGu!uZ-QYw&89k=!֌T;Rh^)*n\y}~5:OJ i`@) Q 'Nb@myTb֓#F CU NeM(̤挽3' ŒB3/t[^ I;0/EWNl8 MǟWα9$\SNLJ,$`PPކ셤&RZUDgș'( E sa΁}{#g=Ǧ(µkmX.pFEX$؜`oyH ZL\op(Qx){,XfOMOUu}5uڶֶ-}Ì֑,s&{,!Qfi'@&0cOhElPUfml Z4WFP4Hk20bWaiOBLY$Pd_Uևٮ[P,ThjP*kDk^#nxiHk(MwilSSBhdr(%#I=A49[cLv_@黩UR<]UZk/MDj ={X9Tf!*-E@xHeV橢y`XVq^}8^Xs<{"du'@C% B&Ŭ`mI bȼ![fiDtIJӅ^֛a8chQ2j#@XDVj("ݠϧl qolNӫ񇄦U{N7׿> ~ox9o+.N '|ax~7iw_SoX.&QB/i|)$,K 7_>/0:.?KomZ2o//ի]̄FMZ4&).h~><|{9-Jߌ/J\ʚ?BYskR EY$^'ZMǗ{ɁM66xH!$I]b_/Wk,BޞONoߕfНa'@41&$Y5$YkPNg=-xyC:ou\cTZ2h!JV72-cIeC#M0tLY|KZ%' 7Aga/z7+Ad,Ϟ~>N"xd5r1&aû^ 8;օ --Wȝpb#P?6⭋Ӻ;[~HJ-SsS{j.GT8pM>o+NF-s;]2dzw {&Ke[fYYnvޒq5I o7cf 6qO.tGtl7u8+ 53ާn;g>_W*@sf9 пҥl bO :JD`ΙދTeȢT_@wE P<&  *IB4<+kwQz!53_N%o<:*u£U8P*ߟle{;/= FSe4*kJ*eх&h%}c:ɧ'|Fg]koV!Z*͢#ڄ,T@fິ Tn&gx}؎ʇD7=k<*85Jm˄&K/m{0֑c17h ˛g) =|R T3N5$ɚhl[(UIm-*ݻ@ʂpTGgaGTzp#.r}ForJ{{;(.UqQisWǏi[=GXh+~W->_|Dκ+u#Lʎ+C8j-06nIu [oݕIL-v]e"g4' i|EAkxk:0e8C?x1;! ~=&xSibx9ǑO8j/KSqTZ pNrRf$UhR rWƋӅ1oaBZ=HH$_*:wvCo`.:ߋEZ-wp:we<}^/~qݙ: ,`Q>3f {J|"4r`2۸{y2 n|"%Ed:@zy:x}iJng!._׹6NOiߜFoxQ r7QGOPr"Zt=2 Zz`-=X(CR A@(O*!t\,SqʜzXHX~Ozz}eѵX|?)f2cݲv*/n]xG|_GD:Q#FC? at 7*g IMb—GW _U8-Dr(]vA2VZ2VEp=&| Aw)JNƼEaK]+t3R.rڻ?{?,ιDyW I=$5 uiAKy yBh02Pe`W#gig}r<yMg+_OyΝjqE,,Tq>lz]# GoJQF%)dɩd=rc9AhK1Zc$ )e"MrYxfZȶBn汪iY81EeLb\),`39($kXGc0j_jǟp;g1mv. _EB]3 ^ཛr6W>Z)vv5?+umWݴ|=FI+#7C[^~xv![&˵h/Wm~MYˤ]I ][casufMrf.59zhTFhb6\Q6a=v'eGl8:dٺJlWA`E+MD+.ˬ}:9(MW0BȒ v} 3`VZ L^Z8 99rF"90 ^%O-rm^#`3çl\ê,iHnZ;m~%6;Mpy5/~)tj}!+Vq+ 3t5턶DNfSS2:*HtFPd H{gIUj%x$AQE0=z7|D`vVo̗i6L\ERؤp(h*t|ɸXYyu8E,R^9B։*@Eefip6FAUZqu.c#WX:71JE0^ƎHƦ2Z~ꋖׁеuVmp^Uw|6~z>͙Qڑg$4J74b>fgO1;cvy6y6Q쐴X!*KVd )$$.wrD2TN'p9Yofu]#>Astp#kdۡ/Dž)TP:E`ׂa22WvT#g |L:ab|r?7aVOoS Y 6dz\k]㭵l4`ͫFLo^L۫j LaU>Ycc li7T`drM{kN'CV}U?dW)w[+!o﷪͞0+diܰcsSL6yYb~}ݛu=|byu#OypfZ?JDTR.J\*)j9JʅJkJ_a%e/J~-$(2Ji(*52hR؁asVQlxgb-$Q<*K]ըS0IgdVut `C6i" 7'Ϗ^y,7f +0(}W Ӗ)SZQGE{(=A ӣH[I Aau¶"WAeGw"@P"IUFT~+)C Noh="e4pub1>b89>(HɁ L?' 
g-$,̖%1<8 "E&gc R]AZI)BZܻUN+?W|.+L-aZ8[oϕ'rcŒS$]29ceP$i/<:2I`j3w'z3'~V\Y;śg\2q2a'gȒ]f3O'9; àh#<c2o"u:\CSvm>ѼϹys:?C4~l_-|Ъnhe_-Zm~zzjZs} 2.rs}c޵5q+ῲu^< ~IʮSqQR{vi^̲ #[՗sΊ:Bh jQcʌf{VuL~p n#*—0O++reW Eh$ K{WlHJ>Ҡi<ì!aޣh*|4y7B'[^; L6TV+ĬG/L4ւMC=ud?WE{99x%T_T*썫]2FҘ5PY zeBb.ZlKm*sHΨ.vVv/qءu+Pf qM:cPZe!%sZҺH=CN$3)iFi% $k\m-9R!"|i >1/<'UTXI[vȹʕ=t.[L+t涜z[-.,~l[{]Zjx0ZɬRf<)Ks9WᲤ7Ink;y$}ɂQ/%.Hww d>`TB;gPM1$-߳ 9٪B SBJ""wZLd%(AWI v3&"˖5r6J݄aTz+~%m2 9x**)ZEwy:֨rQ;Q̳P%!uĠ #THVPby|x ޴ m\ }IJf}VdRwA1ZX&sɭs7Ck֊p\q\RO9QQWG1ۄnCgɏmx=C2B)gsk 3y=ߎ} c1J.ˬ}:9(M!ƒ1,y]I`JB^|H&X/sHh9#sh&<'xx5rƋt8g>\hjg` ER_ r1\ U >r*XU&$ew$֝UT #Ы%͏H! ;$4hm}RkNwv BZifYE.(9ĢN2EL gs L!m^)|\j]HP8JpCPE$PN.elKZg2\I6v2C2v;^vMQ["kƠ(4c_~}G{.NS}~DWV +p3 k^o"uO͟Oar[Ӑ[h=|PNv]Y傡Vt鋺#1z%ڣI4vUlR1NՕ?r:*Ai_OL3*\)`:zXZDP"Ztz]tꔾ(CR a*(O*!t\,Sqi":uoq/qscՕ VX_|WlVQ'| 9me'$ދk?Oa%~ƓΜp+UЂ_%&X6a. =3䨳3 +-J@"Є}m)8ibgBQґ~hU)%w1a iTd.A N"w.|y P\3:̈9z5_R.&'6+.2-% OW͆;ELFHK)a&'O*5o &4 BPu] ޠxx(~y̮qB*k;/Jk- $7y|-}Eq#k)jEs^iϣYrVݜZ,/NK5|m$- CLW*:_yEJRH\q4Ns4 mK1Zc$:[Rj,B<;-b@4: \t#z :eN @wt=;7zt6lLGY6%r-i|:R: ɍ \0^cZWԺi]M[s8E{]ͺCjYQjƭwnyol|w3' z^izrΛjU}EstgͅP2>|Y'zZ|R~`D^JJ*eTނ3F1c;FߣlOZ"ϕl5 uURVR0:wB{( I Ow{Zݳ..T7+= :|(%\Z+L.zܣW.Fx:;lܝc^O4 5(2NXvʎl XeLW~8|$^b#7IkXq@H \yX tΤ RreB? ^+gRA2ZH4Y&-KbC@qE$D&gc RZ-+=BK*`w׊Egx/^O"H\1&1Nzux~3+X 4*0> ?\0u2R[2I`yȓL}]S'l\=H()&8㒉3wl,i~xxz768S_͞Ϳ:0(-:"ƀ̛HY :KZ>VQ#N y7 S/Y*ʾ7ދ~y28_Q%גt>WK`F4&ff9sΊ:Bh jQcʌf{H?O^_LNf?8Xg7lgKO'ӕ~+"bIup=+XHy$oi4haVY0pTQeI>^z?OMλQ[?|M6jfE% }~CbLW>.V:U|Mxigqg'Ď?|xwx׷oG7G\_߾SZ Bl$AoM >xZ547Z:кY|q]Sn8n m'k-@R~콿z~폧7۟r<3I+Ua$i>.j~Nn{0HA< 4مߔ۸ԇWʷՑ·' F98+ol9}~5?j1(\P~MB`X'F_}+u{G٦jEɀfZ0!#c蔺h|?4'tkҘ6fo\}1$A H]YoI+_v3eefD^aaf> iMjڽCTR*e*V_LJ9A'oBUلT&VʕOV پ[>)J _6}1ڥ@]ZX)K+5)CJmNƊ߱"@* WWq - "F{1޾L+ G^MhTل7 2o 2kؤKy y?FBv9Xx0s3k[4_ҧ.5 OwkX&R(j?_}oğf=c)-fa=fZ}Ԏ[{;g2x;JwͿmRc:LuA"8E!Gz9*˨c4 ([$_@MzԩL4/f?P{g@܎WcClB{N]Q"nap9JH|AlT| NN-KWGClZx1 o7[_ d1T͡7ٷPmQdŦU}̄;CިWx+]܈+ mŵ)mwK 8opemk{zcyV%- wYo﮷yC_ᕳ>+vWkm[e@OGj8TJ0ߦWR{yp:x!l-_lp:4eo S :Lh-Y{Ty9 YYUaWL1t!rv9靓!.ܙ!f e`B:dsM\1 ^2< V* m(A(s쐻 *oVkM\I q*SMS8%S{$f*dE_|,3wt Zfx&Qeos-뼌Z3ND'fkC}D]-ݝeobor 7(};bۂogʾߴ&Y_S l|0AI+"waTBXhdDCׇ2gU՝%mu12GĵwQ8#<%"zmd &GmJV/fBJ ő'#nU{~EGHfC,E2P֜@qii@q}4^^RtgʛF#EJ!d-, JԔ3V(\==YYi)L@VJòM,geԋ,,h5*{/qU^yUyiV*ia.\}g;aJs_+t9y&$,C4>vg8K*yЉ-bNU-֨kyT{^2o,6 .SVFGUQ@z?fiC>v}U6ɯvJH!+UWed!7FdxRJ&g_E`k : *El="1M%K$fAI1(T~N`-B%벽y Q9XEd}22f"$ztJGj֜.*_yɌmWU/[W~M4[Z%$(]a̤CL<"De9il%nԂLdV AdilRF)*Ygӏ#t+N $L9%$ܕ9E$A6jUZa>>Nz@UZXt"kBPL8(N^EA)L%).SڛBb8:E [M&^ -nql@ E/,jx2=$呠X0&sHDLmVDmPDϺ|DTʷ ŵa&_Âw[@͑Jj9V$bH]b+5ar+QMBNYY;yVkp ?oҴ4[^z $J**!O6ydژ&q<[-Z6*j i'LsKp JBnI!YLv<$£&`kG!< ,Ÿ p.}r(@:G6@F咱^*quAJ.Iu),YrBe QcBk+ r,j9 mB7,tN>vY8>;kJivclra?$;{| Eg"(ގ_O'UK/lԒ,hXlc t˒ˠRE D˛z7Əy^8t$T08叚; .:M; ͬߠ.榜fp}0ED;-_0{M D)K$ ueYڼ5Z:<]YK:T}?Jyƽ '?zO_*yD 2H5DQk1lCB*Ǽ:YsZ/yDƬ 4?^>V/Nz+[1{E*TyAUKy1UKs7LPǫSkW.R7UJ-@dzF$$F
|'+:V2h9[@D K2ʤ6䭖D7JZurӉ^wvioǛIrde;Vf4_\i=gZo{<C#|ѸnԄYlQ? W|pX˱Qd>< s:D.g`]p!h:B&}6&ʀ{B5]Oc)rje2T.Sr))anS9HdW0qY,8]΂+\X=20֜#ˁ5]**#N[rN>0"ig `\YPD OI22'm:HW7}x9ewo19xHII P0eZ3jj]=r/7JJm\Oޕq$ٿR`S kGf!H.}#V_s֝=DC =\ ӅLosݪ:+lm#ܿCxz൞[j_ @&o):Y)<)-:/C zZ5+ehZl}C1vuFFQmnvwA'+K5@(p)O⇼xƛxFGP9b#G,b3$SxO!"mjG?}yi>x]o#֏۷?,W|Ǘǃ):Ձl0,2U\4MNRԒ2h ߫4hJ\64/W7.)q<%)| obۍr9LͳgmT{*D〚$8%^rVlÕ|,y)~P{jcܼl/O/f HyHN* D:b.'jr|?C%Io_u}v8\̑np}YVLWӀC'iͪjjg?0#<ŲW1:#@t61ﻑ%񺱜0' DZRf5^p_dF2;zs KqmYIYmy{aޜmGSg}tмy9~Gůڤ ~9r\F/54r)M|* ]NluaG? @TҊ׻n8fkݔ}01Ok￟LpB[d- [⒭=J,sϧoQNu;yZo\3rpk[w;;I)owƲ8WbbW[Kx`?o:]=ōen\tFt>}FnrpF#d.Fe@ &L@RR]RPY$UrO}Àe=ǗT_?6{`K]i`4 $80YKf"&t9:tq>t^FxGy G58N{]s\r KxROqOJ9v{/QAM܏S~6/ r4?Xߌf2k+tsanF{d9(RFI/Mg>=u壎FX TK_1C -& 7XW30N3`#mxG*W<Z!i|R *55~fowf7!$QX ϥZ#M%I,aI\cݳYVeB2@dN*4t;Y Y&L .J>\w$e>ZO/@ 2Ww $ @h&XK S zvBF0޳+]îtA4I\訪GY}J@ &%}0.#J )gDzCƙr\P ضnLֲ%$˔y"d4] %Mm)V'3)sl]5Vwخz ==/>ߐ.PSZt[7+aW.C#T n5^mܹ8 >J!=h߈2:2s_>tck[ֳd붉O@}>qUCnF.ʂ8'~0gD}JFy i N(kБW"8#Tw+Ww+Ww+wVA[~jhXPq!((J΁dd!zS c F*vq쎕_FG@x7Bx<=ǯ_{O<5cݹkL.'b\K,$ Ӧd1Och(q0iH$ةaw_}o胗_1공IE ,H53lD4'f K-ɪ&gV-2s qEfƁ 0y T&p]0֝=l2;> -G>խP<'Q;c"qVE3FtE8dcYΊet9DU:|8< /LŬ |Xy,*BvbP)CmT;5V|K/VN3S6ZMN6Ndx1< 7AIOs~v[t~2_[w3@v!uڠ #M1'FCMN29DE Uݲ%H BeFC =9hI=M; =G3-RTmd֝XV$P,ԕPXxvgde7Nk"ͧ._5 ~4| GXg5)9hFu \+C B$1xƤG AoOIljaw-@؎* 3 ?`UEG\xOُoTV~T*l-{=IV32ݘ+jd5<)Cd C r ML")e bh)rF!Tk4^FɍT`6u'$KDu{P;{&y΂d:]roM? tMdꪬ~mӇGjKj|r.j2+iޡ3QIE+kp*9@E;1{!rRHZ1RRǍ Sp 8;jDU*O z|e*PQK 9GK & (4:Ѕh=LmҼ:3-ާz6BO< yB6ƨ;8i̱,=@}1 M!G0&jkC(@SyK\=]^yၥСADo"fˆu>Ңy0`ѩ_oP][ԭ1S72j{vðذ :Y^EFWO|?c7Dī4:6r"fm=,mxA2%?W>f]k.ҧ6?|o']qI?>LALfudˏy䗼/&^i % e"AiД`Pi^R}Jo~]R*x8ZͿKutĶrgϚ/&+"U 5IpnK/A|9+ JM> GG@pG͗er?q޾M2{#!9HR&IJL.Ij5?~lb]k0vGJ"rd=IkVP{V)ohJ`8k./Hcq&Zb2BZĐO@ƿFrø7JyF֚y"xFG|9ɀs.|Kͯ㿟%TO,frq9i: {UFӼ94yU!5W|M.eTRK"7҄ڪ7JPZwb;4LFMw7N]Ϯv㼚vSf_Iu#^9abn\^PBϽ 4>'T EySkuE0`~bYOe=UzW!}O<(FEG#gX0I;sֲȤI"m+cd"ޑy^,l8$OE]xX}_r~V xޗ; Uc;VrN  wmU#b8pq7]` /ՒbTH*w?38$-$ xf]]UUW-5Az M<:6SN?朵g[”Q)2$>jBsW,xKQnI6"#hsOrrm4DJR0Hp8=1BItvF)ϯƹ,\=,oqëQ|s{s.&ޖtO*/8%s`|R_\ !X/E?YȜDaoҰ\@"gMpol% @h"miĮSuFΖ!H/)#)U)WpAްPlT"JJMl>2#I`M5$&v>ǡxl6l8H' gq pYTXd@ PЄmLAE{#AȆ}E~⋖w&oV1G22cLJI 2drskXjM2mF3ּX\%F{]1^vRz14h,@{Ђ_gc!TG` >5Bj1xUЙvy6;#yt0/x b=.*WaQ,~Z|sfn0|?hfWR_O;w82G=c=fyǞ\18'LF+ VCGTsvO;j/7:Ë *&2KX^|@B ջ -HSѫxBCz8ŷog\AS>/YRht%J" VzBx[ TwSl[*1U8!\a"7z ;֩⟬fs{$~i.G!`[SK_|l48L53'srfɟnՄT?g#^ᕗn*r~FG׻Խj{ͻ_p_`7rC/? 
// ލJ&otj&SQGr/~iUk?r?xZk&|P{K# [M]fڟ{=ȳak EVmM4콞KէU:KU頙nQ LFqdbѫ:^+y.fEO ӌT\mD3j9S3ߛ9v<*ƅA:DF3v*+Q| $ #F wV9Q Xq6y>LGfL U uY9%x%x4NRٽV3p)zEY>f{ͲN+_L>4G<J,7%YJ99#Skk{9*JqaI-YZoȥYN L }̱ @:Q"u A!aLu;;#gKͱ^q0]WGL["j괠`ޜi3nf%-ܐtNZeXTtt޾jӟ Ztc%"גV-ԕ.ޙ!!e>on5?]Owwy+ Ʊovj.R7-=^swsޛcx5nTwݳqXym3m{b'{:T揤~ d?|;$)8P_L)#/%L_`e5]ͩwRU@^ۛuFԽm$7Ds( I Q- 1ddL:8u`Sܸ&rƉ}M]u]7yMg\l዇mc|8Mq4mJ:g$G k>{!(cpX/} m*E%#^ n<޺b˜W^yF)\L\OƓ*6yn+ODtdAvaT) @XGBgAFapzU果zϝjzz/(.FE9Pѻȸ8#*C^IQ[%$6UMBK ꑞ`5`s!~('bgluyP\\6h}{cOu i 3Zo_05z1=^}$J-ݟ[*T%k%~$haq}w廳*k>1[uqiZ9=cpù-x)c&wjJ!P`Ev <@X@[\_ uM_ᖻ WT?SUT\vs뮏r-ݐHU QQ #=D'LϹ$+^3B Q 6fRD1Տ=->qqE4e6'Z @D"rUC1h.Uu🚵*HVm*Siaj3tʹ&YrR5{Hq!xQq/\VɊo4`2.$!u9;| 70'kuNem5aɡVDͬ2ѸNe$TN1,%"BPZD b>HНwFKez+P86sDh҆O ;>7-5.hm!W1U4/au.ɓiY 3ʚ9QAI_ULU9uhͲ48:@-XaLǝ˚,uBm*0%OAhôW*H \>(8D2Kjv/Ty#DT"u+Wp 7H(2p>Keu"ha(W^[8N4IA #:o.A)D$6蠽JI; RVTAprrUOTQ_s6ChcmlaE}.²M$:﫲;`ߦƻ.?a||4[aſ.͝xPcEޏcue8o%u2{gd)Au Nr/VSx pAmGlt~pW~'ຸtR2%gUkӧ7玧J|r0lvش4Tݼ½_@~[,~<𿕹<8M '<{> i=ތ육mդ*Eqt=3AH[O\oUOmݰn,ofU^F 5 G0bɇf7}]Mֽ\mn]+#{m^wՒY솇FBroQM_qJE>~[XnT5*7ucgv.O/~x"|_?ǔ_|78=vJܙ_w#@kn6].z:k|~ozoОZ o><!¹>^\:EɄ| fϼqn|F1}g"L2UF " h WH+]Y#RP7ie0US*(ɳS5TLF v{c&2q į1eЖ}jOH?R@e{60哎)汕B$ZjeeG$Zֻ^{hH>G{BDMsrtG'_D>Ye*KZᔆ20 ԡg@J9O;:qBaru#[j' #dkv˻\C)HY+wqE`֟^z|>x?Y`k#Ϧ}&w׹y:H%sֹW8׹"y$s1y{ej$˚z]҄d }LM_f D Om C^e *.@ (r))9-1"IRz*@}5Ƅļʹ҄cr;#gZ+tB!%NfMq܇cl}i|#iHq⎧GT\#μ6Bbv([Ur}/^Z5ٵe2;zb)/gُݔNij5g;g;ڦo4OIwRKmt[tx >6Gq N0-SdY3bV:.yRp.xJsa2[W[{QJ+uA̻A(%4J- 62QEQfXo >=2糎_\Xg,8! H.Eb# xAȹVڼoY1SeTG '8m˶RN[Rv].)yHODɮ༮2)Sv0qϾch?!u Ʈ*4OҼ^ \eS(-DtLʇHPt4GlL'=>FnnF09rb m)P&,sUVLhodHqf,ךw:,I+ƙ̛[T%`_ {Qi)]z8 Ξ#r_0T/+,C5xl0!јF( xFNF}lJ> C!p#?vX޴:1Qh=9!uW V,Λՙ' ~ңJ}]%E0"i`q%e!b> 7F)qtBZؙM|'M4- pgKPD&&4j(:z钏M0ں =ƒD|d0 |Y@ ErlQRh.$^ݠb;[1V]_FGgrqAEXtwAxIU_R/B"'ld$(ܰ#`e,G/( RòY֏EIt SUژh;]eZ tuteE+,?`wEW(fӤVɾUFuЕٳV[Z\}du 8tec^ЕiAWfCB>\l8v0{{c 8U?ЁhQRt:޽y>>XBj.WHVBߥZFx2HӑjLʏpԺ+,i9e[]eR2ʁN8Uc#Cy6j}FH="mab0"B |\]5'9{DEYEň a.@ ݇[t^ MgB6}iD)balLg%/ZaW*czʼk @?>SJ9Xځ->?^i:ތkNɬaB?O?A KH!UpJL09[TFαzEr~5y*ؐpwzٚ^јt'ȁ:!p&(AsH>UIpkͬod|Ͷg 8kdelH#<P2@4%B0?p5Y@pZ9DN0 Bb*eh;]ez"]IDW"UkyohUFbtJ1fAt-'9h%J=)ҕf4*68Wb*e)ҕT*P ]eƔBWVg=;xݳУo9vq+\h8LjDٳ+:ա]O8`M1tb*5t(5] ]1CAtb *KVUF@W'HW8+̈ *.<R[բ 1t,t1,J] Kg3Fd`ai /QpM1!>pF |t7"%2o A)tъUF@W'HWRh B f”BW}+D tPWHWJ2jJRWIU ]eUF@W'HWZ UT=.2\UL*=PjJ:A2pVʀ ]!\8Y/2J9؞]ΈD_]L;2]{vh͑Zd}SW]z ʀ )RZ }+D$Fp^'=7*{p^*Fɝ'ytӜ$=N<qhYTcң.#2Z.2J#vR3J9[Ȅ(|ۙb8iJ̳+Z[HzxQƊپÊƞ}<W>,m*,Vl ^>mH=?D*!-Աƫ4kLJ`7*z}_]1g ?vURcݿ,ZAWv/7ÿ>o'P# V82{T|U%ouz?W*;< :%^􆟦Xs>^ q'Vxgb8Ō_A q:AU|_.OmO4]bd%cF3X1\lޱx;d vl>jZ˻-l\Kq1 \.H]F?3KO,rAqn|GKUutoš]\np5.GN؄]1HuFHKރ娭żxp* KFH^Ћ#h#kgn߰=5#G=_p/o=|Ert=.FiHۻ~k*2;p3r)lRGڟ|γv՟hau媾7^O&;;PVO).&,Y2NGK. 
[$x5zx76r͂x}~7;:Np3?^pkw'w{{#7xjGbsek{c2w՗AC?Z41>NcxmJ-l>Zf [TvLQX19J,*gHŸNz -K|T\|t\|XܰVRxeP*%C$gASM"yO+o3JhbRơlHr.e 빱1$1:Iψ{3;8VcYOrqX*Xjy|96C/ޗy <ܑ'AIʠʆk'׹[JKc*TP[CYbcC<[|ʾ|owwuu{ahu(u1ypL{k_] {=jJZxm#xtEXJa17#XaP^nXtLQ$wAo!Z橏p()'ShuSWp^˼V(KQ?*?{|6 +&!v Ov_ j<]cτ_R9sldj}}/{h 4l^))֞-6ߗvʟ^xsq % (kmNS| :HЎxET  kERuʌRBr0T!D*$D,J ,Tr^lB;:3vU®,P.s]p(mynH/ Tbw *;n';gl$QeStNZB T "K IrkvT\ ٳ,&Kpb&&10 Cw6+83MC kwf꘵ڃ}≔x&4eʢLJa<%5(<* A;$EP@5= &D%LB\SQ_u̇yk~vqˆǡ;fD=0/Y&r^E@H&&"ik䠕D<$NwF8}(SЙAKg\p^CQRţ=i[E /:fΈ!/.S3+99/d&FZHZ%!Nd'H},H: )0;{b_3 k͇@aۥZ-[=G ,5嚍W90/xyckkgSWiuIpC㑲5#iX1KG1rŒ5&THR2ܥ gValj&=];2^|?l2ylYTtEF0<( [ @ʠ@{{%LjaC^xy{LHs]6\7L,h5T"=kknEu^v=_\l6٪dTfrLHU%Œg2I?`eK[-{z\#jD1<?׋+80Mřev\tϛj/[?ȋӳ#pgɓhdDb1O0i l@;-Qp6Q4B再{+gP RbPCq (2)z 1 dvUD,fx*yp'k$R]6r09(CR͘bn<ҏ3Bp]TG-|ioVnj~e<+L>cN MF&ȡXp)"z2al1A8Z}KF39^^6Z{.sTBT1*[0) [ 9$8ʬzӉ)p=n 8~)oS_5q;N׃V /h&c!LNjX5"Y~.Hl{北au^)b-"dB=sg>i܍wr96J`hL zQV,PO>%?Wb >pLX_1 (i?MY^=rh貒ȕ=ph0m/ePϒmaBu;B2՜hDmMBVH%nvF_$y]0uf'O&ӿy;d46q6IlOM喙'9y5xE9];5϶tF3h>2IQ) jZmg-:.;6i7mz,l&nqvF󆁖f;NgE! L}ʙmɉΙe,D"H&`ʊibNc.eAGə9{: Yԓ}}QOmu((!A*Gd!5"s!x%W6I'Z](/r3޸o!'YŅ{^kR9}ӟԟ~^3_PB/{!׿D-n*55SFΓS\ߴ|^-x*N =T*#T|eC-g*3dL]2?EK{f|fLWm5g.Zal2&ZiuYfcLEiƸRKJr !`d,Y $!`>*!qsLs̑I-١"bVL'!|0c&o!Н&z3Z[6)km <&HMO Jy-IrGZ Rz2'34vB["%_Ԕ΃]$DY$ҞY:ag옉&%e$"OF0I;7Z[ XFgx&")6i mCKu@u8EtB։*@Efjy 퓌p6F*8H+OL,i酣eLleld#+SZs*#u mM Π~yv/B݁ˇxLS<2tӪX2J)P|:]8%:* ,Udq JFߠ=w6etK8Nu}n'gZF; k'յy A:Kg<ˈV z|%@Av&yM(Bt@YKio ߸t^OZG&EԮH'#;o$z{}o{gt<4n )v@|y[5- $9sLNyƍ;+sqqV+ǽ՞}Mzŵ7~_2j fVQ+| WbSzgNHRCo}Mƕ' #HnyrUД6c/ܻpAGႪp1䨳3 +-J@"Є})8ibG}'M}:}RdcEjm3*\O1?]ĹQl4|I2h&U'hhυ$ &vMv{n<9A˩C dt5|cv \B* ]GvEv g9>L-ڋ/:.ckmE-v,ɲt7U4ţQ# CLW*:_*JRȒDco9ZFѥ1QȊ4e"'kY "#lK5c"DpE=b˘y2ϕB ẢLаFa:Ɲͯj?a&-QLb~ޞsd|j- ֮<܏u­u 7얮Ux'yg[tknkDn$mQgPWMCB3pA!92di]-Zw?ty#sֹ]vq9t=koyu{цwy4v ֿJꖎ[+w%k.ot׃ IvMڽ>=yo?b3yjr2>ix9 d|"j9PD d=Z<O;p4bJ"cU 2CSh|H^Gu >(~κ,Ԅ*EǬcesRv$B>Ia҉>}]i%_fTl[ӹ_5>{j3s&NOau3qO&Gs"ur$a:"m #=){gAۉ6k]}~۫M >l_Ck'/o;y1;˻OaQLR l T ]L0\8S)'R+TAH:)GO7e)fȳٻ G1v/b&HZCJ,%r+C%?1hNr(ͺ3Pҁ#;2r1q C_JdM d#0g k0;(6zyPMh~փV/o;Le~qVsS#.vkh6z]1=^}&*VZj: c"KtnM/56X9\U:{kOH@V9{dY@V̆Dv'fń2kXPrVܠQdᥕZf^ӗϊF_Qi룣aV)buw'{xjm[dr!FYb-`2$3Iw]`89~XC9/hS+ƢdVC\%L}hfGkEP>mH`3t2=ՕB^>ʓ1@XZ-3(U@&*0ΖR1 HUfuUeǸ89s Noh="e4pub1:bȺ}P-'Y&/j xBQ2lY\Ń."!l49^n vZCKvٻݺn$+~LHV,6Lc/2@c*ҶeXr&Պ$K-9!G:֮MVZ"ʷ>s.3>?>o۟Nήrͻyu~gn:]̳>]0sv+;{UMCA<l ymܖs+9KߩryU:79$S1^y7ggK'^<^?m_nm` p.Kgfn zK?>~GX_cE9tsRt6ݚ1% jɢ9[ŸPfzu{uwP9~J(o Hu$m*͑f=Z*CC IJU=^PK6NwOtͼ #|⣅ewL_.Q)LV+k\K[8[_^oޞ5Ofzα9+yuF;lFWxGkM,&$WKRNziHhVMSYHͳԨ܈Q!6l;JGb̫/"b#fDɎFeл+Fn ;>]Cg8n}Ո18P/D1%.梗|Ծy86Q՝k)qة b{@7sGT|syv:CՍٿq(zKW\K?yq?_mL_ǭ |Nkw[ )Q%fpJS[d4oU ?I^;?sCEhHȥ.s!4KgZsgZB<98__ȃiڢAY4qٱ+D=HÁ oh+, cR^9)DN"WN-i#JFdҭʚwkihԋv\ !kZe2pbs0LKz0_F}Vj*sSf3)~REqNb9)\fr9X6uط"-m?D\j"gߐɕ=qFUK57Er l%̓u24WqVݨ4Q޽p֐ C ( =KJ2sgϯ??GŏKJ `9b4j+ŤٓG-=sc ^k>u X?|-/0!2߸ĢS=uw[eTkղtI7zak0wfɅ:tZԊ:5hRnf->o0VrV 8U2J:)u4'v1^34\|be1@L[a\k&h["fO } N{-ődG"KϗȎ~ܤ1?ݰ?޾s꛳5EFuh-T,۴C+T :y-(Ֆ/5 {yu#)TK~6F)IkLh)"1c oRؐZ){66ץ Z]-'pG{83>gkнmoMEs^x؈#ds8)4xǬSqI7!'FTEm ȕs;hftK. 
^ O0QwX\GjWv)ۉrJ1*KuyεT&U:8Vc9בJv@n58oLo1"+Њ[K{NW%ZGZGHc"*&d[֖ar+]q"|4/k7!7\.HuPnżCJ@L͚ȥ6u.#)$d{H[GvivtY.)M6Pیȩ0rŁ1٣|T9( |k%یCb(/u+\7]!x7I%Ir0P5Kf!_ގZ\=>> s; ۉAuUr雿ڿDŽ̲5 nǖ[vف*tmO>/A~xO_p㖃k MN5 Ҏg۵u?y~w^4KWmkEBzI WYyApk?{ƍnq;i b&ĺ%U#q=Ʊ%ǚ褨HGsq^p(-CRp9W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\QW"W&!\Jn u+@i- Ppe W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\1W0chDEBFpU3'k b]pPrc\qAF Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(zTZIp% \ mkWV\JQpu+ŭ (B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W(B Pp+\ W{\]S0 e'SFojubM Bp!ɀv(em> }҄Pa_>%xȝJfzu))lh,-ˊ!˙ȈI#N(G>F5]ͦAˤ~Y冡}mpZcV4ձz!mu9C3fՇx -NsY⼚cbFwNdEzG(s Yؼdp/ N&+zd= WVz<8p# ܭV=2|ez%@z9ȧiF,jEm$( X^ KjiNCۏCAK3\prȽ1h wzȶQfdAږ$pVօ2O/aueLYFH6n`cZb\r}!vBِ!b}sPytPr8FҪVe`ۓtZ橎#dHWGHWFq"L 7s\BW eNWQtute~65l 8x~&=t%eh"ӟM`%zgp;j+zj;&[ЕDz裧ZH[DWX1Jpi ]%:]%V ]!]1 U+ \BW SJ8`Lfٱ)%-[[(xm~4u3P<]Kc9Y2]8.R2Hu<)5G]8eJS2Xׅ*KLTnEKGmք `>$Cc\U'F-*dJZ*LJZ% rҚJ*J1ҕZS"Jm ]\CZ1򃧫#+$[V Fcga:!InրOIMp;ާzs7E89kNKQ0mq2(x +CzmڲߩqƣpZui=١\_( 4|[/{ Iɾb;߹;;1Jh/%˼*ZcuB_EԞson'^>3tj9꿟1 $³tÁkHC^Ch]#CZ8pǟ(Wgbߞ[ \w2)V0ˊsrϭ"/E>׷uI+z_w}1\ׇ孫r}-]dWҴjj|3 {eռ OJv[#Z&*?=fzRZJ/nU|ZuS^Y}B@l3^C7`aᄑғb i랟QeNP]R- NEh >QAfh!q" &5+}.ӎF'0*!^tG~_Ɂ҅GT4LbazQ8L*D)SEЖ i$He?Y|qΛf=7 bqs*xA\ll8Oo}a;9`vNu7 <]x^㪜=ބsZoUV^<'!UiGӭ^7<+%L"- m̥mkG+˘ ܩ`SXp´7h) npǘ*o:\`VA7Ç VM&oKg^_i^+PDiFSaDD!*.H)wx }yhNNFa Vԃ^毬ZI#c1PS<[B-NC5v>~K0UvN[\{y l ojfu&g0ڕwٕ->?]Z/2c}lz]Ifߟ{u6І?߿8sU 䦚5 a7W3?|c5MSy 0(ԋi& ˎu7Ze#oƶjIԪ*$.GHX0&*`2;N]ڡYj&K zK lyav30ǿ/|_RfN_ߋ?>YoabA \/~ՒjoY57>U}z7{}Wm߯fzq.y0|ׯ._Pnv &l=gBTONWa+JEԨ~fm{<bxpӺp}zsfqkt՗'{͛%G(yva3U:!mJ} :EAyI^pl_qXr^^Oƶx9UӗV6CP=TsǷ|gQ6Do)`JPmuQZeꭌ2NQz=q⒒3ELErnecxG]On {+ K v=΋vޙ18B5ݔPh[F;^^}VoogWCSܲB7-2]bXFI[XR`އoڟ^vV`MeeR)"z+I((El,xўՂLIi}%U*h:~yٔ^RrYlKR/ -N:p?J蟹h7Yr,d9LJwbb9MGZ( p3+JE+9M NBt7&9mut4[sb rόY[Ma 6}PC*wʦ[w %3\:Yx) ќ{` s9 >T2podL::u^$x>@$Z*a6h_&e+IAuA!~ <۳78wIae0yxٓyj;0wGzn^<|s?|~Z ԚiսzD# eXUy>|W?KW#̮U"Ҽ~|ׯvQw\e"-hPea˲$2cm8a\1z%up̂B A{w;;ַ Cng6zuXs7PD l;U9"5R[Bi-'P?i/^>9\vĻV^W }. ͍#>w3/00>lS{P="AX(Յ@s0^Qf%pqqhc(2h$efx#)hE[OMDA-ޅ'F9ᔕĠxvopk|HqzX\ў6*oO#! )C66tyaJ]-|C\ 1Ufl7 (fwut1rmins-lJ ?OYU)9E bđyJڰZJ8Y~p,? G#+=Dqg,aB:DNIڵ)X,\R V—-ݫ#+x E*95E;sdV1'Xtnҹ=;{sI|>>^CmԌ? sȤn͐&c~ñ_nՔ1㣌jBL!WI!퐹l".,"$Iy T߽ߣգG$t&2@QYz1yŬC M!h^m%CW:I)h,ky DP) @i bJV?+6)j弬{* /F[]7:3k .:-JrKd6H^Yįw|XV:Vh"d$Ŷ2Qyt]tY2޻ [VA:DV*!PY'TTո B۔T1/yksx5΁qE; /E!*fI,C%搳F%@8  -KIeX-/W@}N? OTQ±'G {D@wM̀=(JV3O;}Չ"%+TT~A LRJ)))8^?=GNL3x%:1)&>d6:$ 42Ed$[1Afb{JK"b9'?v[UσFȱ/L!Ü1찄,`$HͳRR.JɤrHXikG5ЋR, W\4HTRV0kCM|@e-IfQkcUNHYcjͅ-P,IJ$2xSQN:pS[FBZl{)ZO3zlz7dT{| (fz̗o{ׅL9toxՁdtAu(KΆpI@7OTyӪ?N<{qEVoºQ4nꭏY;{qUU-8zz8(]HBAZKP S&cAtͧs ;I^9܂N.Rw-K2{K>B;06$^bzp~)ju^]2A T !Jl_[2z/S] Vt#ujeRޑ0[{л{a;*cz5g&C2e>4NaL(R)ٛt4: z*BؔDn%|nw.d⤷V#A`!:IQB!h)(7LΕNՖW?× ZS6F ƙ`U X}!(u M#'q@fK-'_-eL>;6tvwo˘ttJsˈjz~HnJ 1yAvZ SE  46#vaȖ⬵Qy} pxWIpU!Z ]*lT@VC$xy8[â߲XoS-kPт8T2oP"E=s?@nXlėZx]FKNtYUAJIiQ*KౠTm(ف!l8#>ݒKAh EߧfgMǿ\2?/}vlyazuv>52w_Nǟ΂5Wu9Y耂"BTB?[f&Z&4wtEg?|/<s* Ss< ,ɽF/|PZ^ b Eb %۠T}iV& .넖3MծY\8;턧M w oiJ-S A 9R.]ucI=S -8{ːnv,K5m^CsHowjw_F}J2៮yHz:;ܮxGȴ&}5k l{da%:Ux=,G~UMpƾ6BYmmliy;}r^-l1 ]u0w-X)5S,FT(x0h>b >>' >UHΚb2Ε*{lTBzmB~'}!1( +~|iěo)t&8PH}WXgׇЀ݆B?lZ<>Fk?l'A>@Gz):oż48^$/g޼m1%+ITZ. L"[<8k_P;Oޏ)b)QR}6 Dp.;NNS]/dS?E"k£(#HG+@RspEe,ulc?pI =lW Ar2rK ՙP\&HJgPK"#I2.6 ܽEo,JHdLQ%SE25:۩p/Tz{:,jfإǾ7cMO'󫏗mP*ݼ}gls*˽%9ioK5E U״c .X@ (lHN8ʐk'J!VWzWr.›țEKC'o\.<yP@Xi5/0HFfُJ;,lebmv“koܨ&̯+3R?_+5lvqVOAl[^tBKP S=a☟\<5U=lqm3xQiUaS[,TȮo'|( #Qa~$o}Afq,jƨ.ֺ󼖥"[me4颭 LђLNeKM"H`9&XI$6_UQv.٩&Wp©_2T` "6c7FD?  ,ދ fLkQL(w% Y6p yHP!Ԗ!//*`ߘLk.L@31y KV:M>%8`͆gkŹ>:YɑEc\ . 
9^J@(q"R&Jd'RO&`!̠​ža3 {0Ȩ.a{hx eid]tZHNx1-9#vU54ǣ$NxԌ(񨱜@+J|FtꚶN@6pysDf)EJ"qHslاJޤ=Y~Zr6.|x8A:2V"_[}T*}lں[nx2a5ld;zԸ07ZtGOX6Pbȭ|O9tqձg돿n}"{k}CiV_^d*} 3FhwDVnYmX=A) BbCI\.E)=[sVyGo>=9N NYEwbbvަ=jR Em(!G" ] 1v!Ȕ({adDgܟ㸍у\/ݟhv9d{kr6zs%.OOϟ.[me$rKwe/\YlS|"n]:BGsr ^S2]#H^vz#S9>.gmq!x5}}:v]٥lXҤD*ml80a<OY=@#w= 3%gT1;r^a|j>)lG//B]o؀JuCuϽR]) sR?t@&n}~ցR`Z|$cG7 }&v`3w9k 6?tJAzY%7+.sF~}VI}E[26@Q{(ޱrXD"̯//NQጒɲoJsθ`*2f2a4x,m)ţz wl!RGJY+s+}ZYi{n-SH WyNv*V{/?6!249DE{ ey}-zR>4ˢmI5Y]vBc; #mO Rf0vhCjqv*}hqO ⺿4wf4.4{dw/%[]qz dQU-t+D໺lセ.lNjo 9r]yE,YjEځ׻^b[sZp/Vץ-eK"8&_]mߞٟů/O2K_ JNGI ,Wг?(3f4Gh[Ț.h_͹ϳpKr|:9ۻmVI?[F^ܥy>_QƂUՄj"u- Xz*`f F,,f~eŰWŠ߲'W{)8F8iN~Ru/oQ^~L/o({̽%y?/9z+[=6a6m松5S;4CGnN]8% mEC ,T4 rwz}5 j>*Vmhis@+ :T+)A[•ut\`WeLEȥj+VMSE%qFrgFiU D- Jf0"qqłgȭ'XQpu"K֩płփ+JԖ?*Q'+a#&GWf~W=]MQk4•+pЮ"\`m5Ղ+Q JT:pu2@LEb)J"Ԃ+QB2Ti4\Wր%]yݱQ6y/K&8,"UiLZi*Q5L SEb*EWAb؆ĕ+WX1X D.P-$UZբCĕsmM dPփ+VKJǕbWޑժ"\PO2rAZp%j-W#CU',_ D.T3Lz,WrTppEՅ+찞f"W]ZD WOWaîG*((`z$=/힦4=vJ_UpЮ* &J;q%*o:@\p%q4kKǕj:@\uGKͿdef+}l"--L5L&5i]^!|5.&٩m\bQI&;LĂlz3i-;&i=[O X"E"Xor9M'OSp*6pC @uM"7@-b"ЅJTq䡀-' ׂ-W"Zp%jx\J W+ϙUU 5\q%*,ĕ2%]X3OR(W2X@d)JR5k$X6P:D%چ'+ڰCdp`=v5M3\MS= v W, T+\-UƆĕUDpprI_ DvOMLd$4kt\Jo|Aّ;U'xȇ 'PY0@=LZ1*Q1rwzKNWd-.W!Z Ra6~zc10d~~M ;@" Z >Y<wIխȓ׌YSPqOV)yq/oH)?] :&.0@\'wwfsI)2ƛ z蜷2Z]1YY7 a/v|fzjI/Qk軟;7gg|{a,M~[*}s~z:{lVoWy?ʯnNEI=ˑ}ȣY}dcN!9C._[勏HniuOy £ 䐣[M *ĩ*D!9I0X[g/'olؖ6ؙz|#;'\5zVSCgq̆u WfC&dIY0 C e?ۚ lWfan02T~{ tb1)t%J ]CGC ܻwn\`. pz``g@ ${c< w !N q}!PkkV>myR˓nw9oo?idַY~j5''g'w#wGW3*FLg2iػDVVBcLS]ɣ1}06plWX6"8P'rJ>(ߊ{3:҉CBblb*moD\f͜;ۚ<`+d02r;wГA[*C$#9+D $vb;;mHvWg!)Q_+jwV4{jzh%uQr\u(y܂&nN:iȡ֡+[20hFЗtޅီѬײw̗0#}pJd<7XِqW0-e 4.%lБ灼B;g /k4O2LPL ك:'aOrш5ĉ5BFFU"<,rJ9Bk)r#%P0BubmHU,B'B47UԚ )uH AF)KNt,5BD q/ގD`vѻhaMfCj+|uszmpq9,mP#JDQ%%Ҩ%w1C@m ވ'Lb.d~cDP=@[o_Ӂe˃ꇤ{TOC~ ֊gbn0Ȍ^ 1t-S2`L\=+WנgwȲ,柍Y9'Z.2͔תz3~7jV"2ok = Lsar/#6/:{8}i?d<U*j ?LR[bZ{T6e>jJ#3K ҋ$\]т!W;)mKz,l0%}-m]9[6>6}-Ћu#m;u72GX ˁ~Kpe8}0,{!I[+& "k9IAj3GV+ya-'m-}7%$.X:>/ݭm3oXfh8p=wmLQ!9e(Cw19 5hM\6seِAdV/3 |?ގ5|S~\z4W {#x!i_o UI d=s~{Пu/79pI:ou6l'-FNI%t&{M*讚8Y:S oe,3 <* ?Jη \.IScI~0F0HVl;tBH*Ώ hIomCNf͇EYRFo#elJ0's8V΃+8SoR!*SֻPhNienK=?^ɄGݩD0ͨ]1ӌM&݇iM~~o~ߏ6dj4J'S_ɭ[ZIe2\'iRV~yQHdC"zfȎ;:kNjDo8(hϺ }+7^ _0-NZz|S7;Rq^W$g"de"oqU4 JEpX[\ߚ-~>>F۷BFaƊFIj!H[X]e6AԏcC 1(Σ"[JH\I! Clt p5UX Fᘋ{1.-ru46 S9&Ξel}Ir}|epٱ$P"L,,:\!3K"u@3gA{6PԽN'$y+C'P eK:GkKNmp>im3Ln49^{5=9׮O>MnƃMdvmON`vm;=R7z|dzKs!K CiKFB*$Qv(YBVE-}![W89c@( %mF4 Qp# xiTmXMݞV kiơTGq[5av]QeNoi]?M3"d6 hc>B&[MP$K1DO0OMF{.s_:{! &cR)MFE>[؎9ByP$G$j?<%nǣb׮zm[kkwv3cOT:e.gJjk AČE,9lSU@h(B&EI%_%#JmG!f\RP *jmPPxjqGt=&2Ҽb\CF)cSմDᢿ'#eTa9rg6U<e4 #kY ُa# 1!ލL7*:ߔ*S 3sLDSFt,x`8{ӲHj<gTͪTh>+ ]i0_O&D<6_-8z2Y4Aj UrՖ,BLrGZŭP)%C3:vB[qtה' dR HdҞc,~[M=S!Hg)c)m0{G#)lK!kgU֖yAf GҚB`NAtRcfQPĹ(>b2sY PgnCzUf:;%#C mA9 X+OJX酃eL_V2N&ޕƑ#_)eRA3=@70`$傥ZUv,T҂e*dyN0Lՙ}Wyֺ} iB[4#k| g!NOp0)*׼e"5̸LC*^`Qc.Kyڣ-@JC+u\4ǥsJXitTvRcz|pjjޠ59H]`mɦfj41$|m l Q2YF*XO,Z?Cz 8+dgz̤-`911ˬO&eR5E'4^HggS=]{O78ᶾ?U7ʎnͅ뻂ӶHPO}`M]8eio`Hg! 1&hMPÀ@!nGDn3b;Y kI AXctBdMUr$,cAZʭu]t)&:9g42?cEK)ƛQH F:KL;!{7;[ٞ@!'ON\v;f fbL &>Y y0=\ R&jA9cGe5#XVh/Bu,zzik]R: Ց0IaA(eJUH,ܬg5tP=Y°6a])qYݼb`,Oe0'S~H.ko׻4jw]:*eP2tcO1&554-2Đֿ{5 ^z'5/"V#H٦2Y5:]L26"1WS zr2i~1>p%CÄQlt|Z.<1]QPBD-9kJ6ZMk` l :9 |AVvq BP!UA` r#h,s`EN۪`!:8'<yhwGcc+vhLsc\=6<nUkea>֛:LO^W+tWW.+{=%u¼]`?,'@ %B흈p i>_3^oז_щ odQ"ǐk qu0wtXu7} PkrQjtV1w$n?毣.%YqyZdWo5s>{M Hw>j=PFF{4_+~:}X:8-e(|O|f?5C YC0O~4Bt-{}KG]ͨflf]T޲<9n(:u'?~]Nxip;[UbsAv9V­ZsټԐڰe)GMy*z/x~p{e5rCĿOG4>?~{7?K闷oĻ_xf[xg읻uߡW߷oڪMk44mb·irGW%m V;k@Z~IyMJ<]gyz+Cf;dFp>^ 2;[;bv O0LrsP9~(K]AFXc]njQSOA=@R e) z瀬! 
wZZ@H;s149EQ|YKQ͠Q5h ^o/Fyzyqܦ=tLӦ;rݿ.nVX99qp%G R)lN BPXXgq6.G Y&zb [PrjVO O IlQƎzu;2p`'v',vj/}F̍mjk I}Œ/~Um`~Ȉ49q`;b#39-UJYKX`UIRQ05btW 9- F>6zLbҘ\;%P2ATg[wd9kB#d$Vrwu3x[p}NE:|Ng1Z  ^?d>gsJ׶ƮMk4C29 m~t}Gq7?:# ׻ݛ9=7Tͳ>lf!?v^y7?|q<œ,u?oyuQ|<3R Z/˥x^4 iWm)vkͨxCr}:T} T'$E1kr+:"Q| F AEG>$W8()gDa%tFYІL'LJcU O G}jIH4HP|be5-BASvGS&#5!U'd e睵 jWt915rz Ib TGQA ¿1' P)TS2?d/oWZ+bT8$=+lv9xŸd! "qбұy,vAytx\4!OF3yO-+7]kv㿽wNQuTPo_s.Ʈ ;oU@dapm%Pkln)^(q+ #A*' ';l%>| JwQ{i@po G;=>K½U3m49HAЖldiJcH&Zɨ <(``=]2됞- 3iK)Xpc hL2IuiM Y=.iw bw=~m}ޫzsŴ-v>3N{]m޿ZŹDʫhmRc@PL9#r$]?_񬂑P$lSp,]l@F0WSuMs2i~ыs%烳8l<>x-j]jBD-9kJ6jgUkI ƨ Oӝ}_hH jF  S;FLKOX9"dm0N^};}ful??r-&rH'.&KQvR3i*oZfq|tYFX,.'⁆3 :ȝlE[:;S FS;6|Hهs4{pnDu*JQNBE܀BAQynzDeuOO'b e[ppP~JdzK,._ Gk 싲I27ݘ卝/dx#SRkw>`uz 3|<7_J>ƨ:{~;ɚxdD?;__IG;Pm##yma:UUQ̰dQSl3|acﺹ j6{vU|q9ꢭ BV'g]BЋZD~NVX(u|uu?^}~k ?zʬyTjVEm2fhpmϻ^?} =oy˸OX,]:Gbf]1?o/GFW'ǧm%w.U7mjAi^ODl1U⠬/K|Խr\A܏l V,H;}ã:JhrGhB2+W G*o9U(=yay|~c 6Jv ;YT+^g.,2x>h] TTBK؀Seq)k)M ;]vt5n:q-ؠ5+?Y]уzUela鋝yՔYrؙ2~r-yʅ @V'NYRJz>J/vgW`l*ZKS4hsJTSjJ `LB"wIڣ0RePbUu*@ԱzoI&HYRNS48$ijqeܷ-%vg{V4D珊g_/O]wzoOד[NY4NW92O!\(lcs(HY_DaPYv15{l\2fz- :u5{ vo>BiKVKgشwSpeޣ'EpS8ʐ@ɕy&j~MJh&֚jU)8mH{`SjNL:$8(S샭%pѡU}w9D*Cď̚tږXtXg5 $L6\ELs;Q\򘈢QH*t~>_9{^^[(fp_;z@!;(U3W"@w*ٕ`:r89 Sw s=6=hy@hh x*HD֌ȚY3""kAdm1FT$F0ZGRdja xc {tF΍VHGFݕB6Uʄ౅Y+%6 H ;TB*v&3[,2aDv|=j/e9;*a{i6]n)-ӟ2?u- dZsj8.5dfc)$eISs_,i٢˲Z'Cs}jjU˕Y~.E,]wLӠ>P]*-@ܐ 1V5v!W77L]ntV1H "k,2g,01)]шA[|my#j*եV石) L!G"¨Y̘7峤[zs;l:-.1_xU^ljHT~INټ [E({jlZ1o l O%x0ZIL +<ՊEe\)6 |p.G48{v:ۡrx,{,T\TceMM"EOz) kUSO՘U.ZF)RjoeJ>o&*M' )U4mM,PQhcU}PL3Yu*VB2"Q1Uk ldO[uSv6M}He*aFmp'@)JUv4 26cnr_밝~Aw64{tv!9eX2O-DKE$$L&+ Ϩ-xói oF?+l ..vύŪ@#VK/C@llM7Rx@ y=n (GsKy :^if M),ۡ3آSʎ}7}k5VDUٳmcdh !O+ý7}tvO\zc*FVmcrbhSSo9=~eрc?GE՝bڽ}5?<\wQיHAUgC\05Er|BV*ٔ>,{?r?>] )t5_5U3ES?NvWk,]Y"==]Ҟ)3#,=LPm^IZ44_R=f$ysrvNгz&Ґ'aCj=&L Y䌾tU փ%(J0Ny]OeA7όAʷVv[ęKE}VT,DwT<^y9crtYe`}2-zZl}GmB݉ʉAN|7^ts_IyK_8[Fs NU;u\/{gǑWز<a̱{X`)5L5a\#ȣϗe-~uէ$l R#5jJy)qWgÃ'}M髣}]ħB@hAp8xA7'G}_gR B. &6\EcZ?͐oI/r?.n^ W,},6`h/e!fzpq{q^rGRIr0Χ!WҊ. |O۸Ճx?/p|WlkegiH06VgdC5O>}?Wum-/P.k -b"Bݫkk!u.>=Xp\xU\_>{#0L;52?6n40ZN,vlT"&,n=4>q^rg͸ͯ82|s MؔחhS^;oa;1wp|fm7>IQwrɻxOnu//;NtQoN8~Ń}g0t#xOv}TږS<_ݽ!vș|DQEKZ$Kj%|(Y)i)T[~B3˃gVZmAyޛ'i19Ԭm250؊ IaҸsr#5qЛ%upZΏ˦ƾgǪĭB!Y?<MRDͻP„;iMՖ(8l*"Q̤OJ=k^:g(.z1F&l.-6JR`ƨ1K‹eXud*s=־Js"$v@F*e§P4.CZ=e'cAXr$q[QPQB mxWCkazhm~ԠӊÓ2Ah7x(QMd-}(q2 ken±D6V:؜D%`ͳ+!)st5f,A1x7 ㄸ[=FE ] * "V$0<"f  LFk=42i,ㆩx*dM'(Mi/@5UE$Ic^ϫu J+h;6b!ՠPwSANtT(c)(d%PiPY0(3)ޫi6e%KM%ԭ)) :i,,∝ua0w*ɦs ꌌ7ź*QLK* 9o bQb8 Epi7i,3 5=(bm LcSNT di D$8J(46B je| A$zv'UP:o3V7_֦L ԙ: wA>/G6^b_"BbL@r|bQلzF0D!0ED]v9O{3xQ ^m]1Ǻ 9/mZI!ƫE G\ P dO%c)іF@˥W20w:"'icɡ2֠<%EHih(LbyU,CIZ0=Ӆic=7=ŋEK*Bz>7C6d m*~XTunV% 9UϳW5W1HTܶ,9e58x*@%nW˰?y'sOYs`I%̾6B@Fx4= #.@ AJ,._.$pLDP-F u $$B jwpY֛Qs4ڱ-< %D/ ϔ]IO>6]@jFH;ZMV78t-B̓x{osP:$ v"(άAbD#EAi9DHT @Q%UPբcQy(? >&@JY2 cc9C&ԚWXٟ:5MUJ"42˱ o B)4I[q/TNIc!-AܭE aU M{-{0bJ-ShӴA[{'ϗv/`t>]X0=]e̵2{mNC ݨJD3 58 La>|w9,:*5fߵFj9~$RV ]"j31I3rp1gw·UfU(g80)QC^"ם*1lz(CG3 EӳNJd+aLՃ J`P@rH҄OWJYHXB7o0hl3!*OK "($+M2P4v3ߞg}]V1Z؛ zYQcMPGb$dӜĠ,b"Z`pOK6"\ cwS; X;uML+Pc͚ B%컁']FAL\QhA >Y ZZ-}[i7."SBmE18`M!1  f`6萠e)-xvߴ g7|1`6 ]#Nkǵٝ-QWTb%TTTҿVӳ#瞊cOELTH"U|a<>k34YԮgyӆzò!UALS nx ٧'^9ѽ["nkW6g=Kj-څjxJRvaM/? 
T}<4{ϱdF{6sѵV@zN]{MlA~w.n&xjz{f'ܮw&G?'MRHǜ4sI3'͜4sI3'͜4sI3'͜4sI3'͜4sI3'͜4sI3'͜4sI3'͜4sI3'͜4sI3'͜4sI3'͜4sI3'͜4sҟ/'M$N{r8ipҨ ~4T9ϑ=='] )aO/Q ?Ȉ9i.41 `6"P㘑FZ>=FiA8][Yyq[o{,V`+5=MO5CZjpM]\%|t~XƔbd޽;R_,76y[#O~jcub|mǯ g7k \`v|~:/^_׵vw}-@?L\" 3\3s53\3s53\3s53\3s53\3s53\3s53\3s53\3s53\3s53\3s53\3s53\5\`'_s s{1̵3Bg\JkGZM<!"ד uqMvPeTRLj|H&0l15=UCjzd 꽥b&Z /hҋ MZ]ȓ^SȱK?(VfmH=2Rf oGԐkH=yI;/|8gnXp}~gA'A= 2 h A34C A34C A34C A34C A34C A34C A34C A34C A34C A34C A34C A3Bؾ$)ˁCb hX]gs A-HJ$@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 ]koF+?8a|h o&=hc{u"K:4 ,IIe:b:%r̳ٙ]&P eu@]&P eu@]&P x2n5Kr9LMu@n/ f$ RMRW $uKpFC* UWc$W %olにUXʣ2 \uG Wpj8 )g}IkUIyW/sW!7!7ϟdѴ*7pkxC1# HsWnFR0t$PIZ{ǣiN* UWcUtVc+AW d!+ѱH>tJR#+IfGW 0#!ˎf.I+UvpJ3`t;I̍"ll&ERK8Y |/~0S8x:f50̗myI3eDЀO?U ύ(gʄ\SsBɀ>_L9%OvwsrMʋƞ+W`i_w^l94bg'dYA'R!Ĉ@MY})IpMtX0#|JP[OWfYwߞ3.?wU/r6*ӧKrY<:>ͼ lO Sw3ˈY/…w%o T̯F4ތiN!Thθ\~A}Ư&#`pkJUZGf2[GC=|W[BB[T6v}=z7lW[ay}}j-VuuYܼlڤz9ac$1ĹL:O=R[aΧ[F)đ_Y.|0>8Is +Oy6Wذ X )OԤY+3hIɕZX޹t6LqkJ㷇e\fۍ;8-[cݪC MpN7K3,c*G\A.I?12 f0\OhI`9 Y `JJ,{S%i(?\I1}d#{7}YYLyθu:TzJnpf42ť9eͩ8d_{B+wNFqrsisc̵4 °=~..o}=g=~&OPO1O1N}WAO N<\gqk7㲒Ҭ镡g_+=˻uw(q:ΙT6ҩ\4/t6{hgZϡwݺZ&K F8B GoWzvK# Y N®AYL^x,H.*C4겳$SI+օ ȠmMgd;щ*j/𢖅f{_,ݲے0βg}x-M9]ᨘ]q:?.{C'il&:W ?ueշ)wߗJ~LSR1\ \-`.ljs nۀl%tı2BKJH:`+)caBSжE` i̽4aʭ(Y>Q;vYF\Ч')(c]/~^ 9o7|W-5<b.O 1{wEҡ_u5d5x].TWj,1KZ99}; []LV*u_.@{)5CS\ܳjl3keDlڡذR2ҹ'qz .x"ė,ɆKmWE.ʠm5YDeiy`>m|uaYŭu7$Dˌ}_f=7t?혣mT;bD RJt/%Zy\Xh[$i!O^-\]_ܪ*Ԭx5u}{g2aN;F0$bBp;vIOARj)@Ky^l|Gkm%5ݸv^ nfeHٽm|[CyvcY@V-pŎnpZ" <$o7սk[]'L/3,?$.=|$h#sbJ`:|9~ 2'.|[ ?2mbRy"mP&qx  }jb1F;S"sg]Go\dm66 ;H{XorF@M'yG#8n i_;fPW(=!9bkY.DwwCw,}8c RK |8a<+c[&%2&V C AHRVa SAJ &ye4zl5ͭ ّu#dXXuV#=&D3"ꉌ;e9pEH R8"!0ept]VPnvGE;80H$Ԗ{+%R>Pˌ픁i&r7:d߸v~ qx_%ummDA'e|^vʭ)eMe}1W5T7**oOmL B'F\L&1)yS9dk Ic %ZI@H1 N9ť 7 T̀< g&H۳U¶4c_,-c޹7 Y$FU'9SjOCVw~U 7_Lq&V E-&kAHab%SSQ}mDlZDn(,U42OP` yI|;t$LrTo#g;b7\CAִc_Ԗ-P`7i<Bj@IQ#F`hk02j$AzZȅ4 و띐R@iןTu7(` [WђJC(j٪3[eFlqDbO?g:wK6Q+0`|(`=s 瘽1g2aIXhE*2T| ц!j*ZŒ&yY!xg `!YP;<؛ѧ1?VeObh)Z.{S8NڴB2k`3F0ZǾNugKU2ix4O3GوEbFQ* *Y>CQ*RsؔIЭMu]e-T; J.`&5ցhBU^RBTZrt*иzyg[|Ӥcc1Q6lPpC( ].(B@8W`E<؈t숩9ֵ/ܧMhn0M26MiKiز6kJ}͢Iws<tG{CT8E3ʹ[EUtԊ;_F,E[cL:JV$gSiKn, S~{f96l|"`J*R{iӡ-eؽ-]ehI:P; rZ4ɁqT/6UzeцsY=:*QyT)j9(,:]vPmAJ{gc`kp0ѮJ @\Knޓ.9`ϥﯚeC7 c;'DPvDDSWq gu.n0W:AwP풥dՐb0;4nWqSK #(*H8;u4B m, ? *ps$x3D Wm7 '(:qu`UE{BCts1|VJ|KJ&o,dK.N8.Q/D6V4Xĵ ڵL)zZN9:JbͬsE2X9qmm:Xwnkq [7y|^{5E5z=XnY–R?臉*k&=sM6Z(E] ]F_3\,K~Y"F"biPupJI׀P ]F!Ԃ.8NF8a7g :xaZ.c:VyNPQc|%c`h Bґ#`6*\M; lAVqd @AV *J(bhF9"i@o~z}>. -y<Ҙ'ʺ]56xe{d;]Q9>Sri1laFIDM _x4 WG\h]}ihD<<]c9`]iXESJdN2MXA@(J&&9yYPe@!ZCB&TJ//t?cm)_l.޴ZFɇ'4`~>׌&w~Z纲DMMf]5֧*Xp{FgsN6 آ+ByJUաz6xR JODz حW@dK6)9q+!.N e^<ou_kwI;sO;_JVOI_ޯ_zxpK Y|N^67Iwn`vĿ^S;zsz΅Y̌gooN߭^x}]Uev?}哵ɞ3<9^KGRZ [w}K1vc3E'Q~'؞b{zqlv|"nlը[]rS.A55D|vpDOz՝fկ$* oE&~^lCw"zo}7߿滟ho^Fv?8fj~݅ܯqiioݴEئi7W|vq!ohvF;k@obc]]?iNg{Y]m鶨2fwء[^a3٣/!(n0`]v97rD%JpDZtoCH6R d#KSQs4ON˗IJ[Qztzzc6Gv;YT',DrYZ#ä/x,ŧm4 rñL&XOa5}w^FrAĺ&||d{/Lv-Ձe;T;pekR R~ssܯ4^˛F6|HKDydTZF@SrT 4r*w3%:::(LqSFQ-:p)dRUY&Љ0idM<:1waCH->tqTb+7?o¼=^Gp5"qctcŊ.%&BI& 5 i%3) ʻBo/Awiw5{؟ִ rk (Bʧko߀S~^yT]MYOziRg)ݧR#nyrziJ(uo{X^ yA;@KaR%c (dyy$aYI貒e% J"ښJQ) H^[TK%b,*hr@Wk уSe*YDB0kֹhG4rUga% ݆$;B m֝yj]w㵧ޛs6J+jix;~lQSf{dF7!^0K}L起_ü;׶vw%x@~W͵k]sڈ!T⁀9vD@֎:VTwreCtUjNm&`{ұz+9AYbO-9uq&8wZ9렍AbǶP+UZǘ3x62SԐpSANYOQw{|%7ݛ5Il9HUzu;2q'( B(U_TZz,XEj[6aJwmw`Ft`bl/.E-ڕ1aNf! 
sKSVR rhǫ@3Zř9* !j|2$3<֝g[C jeVW;9 HG,Iy7V~MnG b`U0T\tX|ԢEk4w0][<^OzkUBmU,R%l1C1@ѷ_* A,l0/3kT0ѴQYOdCPX9egugK=_z_ *ǒ!A( HKq>m\Spdt=j:*jb~b jT'f =eb jIrI b 2Z&5Ԣ㽉'M"P|,"J!m8ek2ы(`"rW僁o}a)9rna( r4h+OR!i-Udaa.B&;z?E;jX;ܗh=Ynͯ#oVk=h(kiMRֺM|mZr_p;QpvԁC!Q*r/ `1#wS~u9(B!Thh2VWpml锲gm^9ml{gc⢪s9[̅P=#Hg9ȗ >([;C[Y + QހZ3.6.K㱽͈qv~R"gQ1bjpbNB=|wȴ*@˯!zYԩ]Lu℀k'@wV8[GUiR䳬p۽gaO `uT ;S/{^ųx\6*w9.UtX(xoKgFE6jm!䙋%6Ax3mY zy" cVgD$k5[ev*}. YV}ScNym\!.k?|ַs0!<FmҔК=Z92n"*;ޕ$Rvc\R?t{i{6<%)KR>UE$(IKiͪʊ̌2"22Œ9K˭.#ĂeeP7}[Ƣc\8oB YXU"yY4 ;T Jpibԋ٩BS3jRNR+ T.`TeYNG)38f_dЃS% q@xEq2|cldisZq2OBK-@'~/z&aR:Aqx? 7H{8*:}]F7ݞ]bhe~SuӝI]_ PQh5S_͌-0W7+Vj;irϻKa[iӊY9V[_ۦc6rv[rmFvy.F MC$,ZO Wex ͱ 񺼩BG#RlOQ~>C9Rl1w~ܾ#R߶FXD9<@0fS'Uw<@-AITy !T<%f- c),cǤK.3AL$q;L6YZV?V0#*)`h-LyZM0&=9"% 6ZkZSEkYo·`ü~NZYq};kA? dRKJ1}DB}opM8"ۇmᖷVqtx+^ix=ԊyVLFV9i%!1vDI ؾߜH*`Et"WCil}0l:\rB_MR2O klc ulsEGah!ԱVX~;M%ܙ?c] CWj?%{$Ϙ@s'*(5Fn).F?K#r;̬gv3ҳ7N]&t$wJp5N[2,Y?$;،K$衘@D؉J&7hbt\%5:r9:JԒGLToh-eÕHg9+3f+)Tt<VpD>@&* 97W q:?WUY16ibKX|2_<.|9v`ð;oX%w*LPjjcs^(gҜMFASx|0NH]aF}_Gq8e~?L'Mwܒ&lՅTk~r1McYCXd=)Zf҆3 sl"X@ƝUT=\b!B(Fy0[`kzղlNaf&{웓/&fzfGc:kc9 hZ4 c?l5K{,|ׅsrn6sv׾.؁/!Fp[~'j+ _o),*+J9+lw/J,s@+9?(AၱXjT ʔ"xc%7)UጽVc[Va/b&x+u` n>(PFS.` E,½YX L1F: lp6H, -d(_PBko *U{:<4sݦ.|.Lell`6v{r_^VK *H ΖV̏"d;ͧa+go]5ܼ>/ѧ_NuJAj-7eփhZctH_!" K'*QKcLT73(kmm҂8-`-JS(ނӕ"I{(Z+kƖZbsY)2I^ i0(x,c`RHΪLG eZTZ0{-#chnDdV Y1TF\Zm?u'ղXS,)N5x3MabKA" YD=QqL #A*G$洴 ׁKMh'&]QQq!,$S[)rrYPA-ftsղ-į'7)Ѳ]ʍN;[mj+r:-1\x_% ZoMzKxrVEᩕM66霵7YQf7jK00iIa) ^yEN",bȌ∅{Eo-'X!/}k"q~}CiA:{M@Z,M b%USQ~m AU6b-O !{D%$ؤB:`^e) Ӂ.D@{7"9ێeb>ۢ̌G v#!vXX*eL1ʥ @#X>搐Q K#-3AG hG9AQ.UQ>r0._ellʨ}I߇Ƕ2#:"qBS Vz(C (9D]eJ `pA1"EFjZaB6Q#KFbS, A94aK3#b6r6#NjSme#\יKE2>qaUc cE?~j#r_͠ǭkc;#ӽsMF{ <1xa6TPstzΥZeIh`YkH g#zP"#X+냉kʈhA #(H8Hg1sb+ǓѧբNcз+ rҷgYNuVjy_zz~k0GŐ5F1/fŴ6?hB5V{A   ]bt!86 F)F$Oc.o5`j4YKt8 yS5KI DpC@,s$}P89jZoAy)gSfQMi塹>f tjaEy<!q )F8-T"Ewo& HyAxFم?BJ9"SC܁2:q`eYdJ(qQJ'r2"Gy OR׍$?Aw쁑फ़!nTp+u1CWTjY}wzSMsۀ{xq\K&-t ͋ mF_Uv{9 c7.6ry #- A2ae_}S͛b )?UȺZh4ާ| Ug~Wve}\ {5L#PzZX&TW BfK͉FԚRdt-|&lﮇ- cmU&}*~Iܤ[U7<+oLOEuwWL~#`M ؜=yz0 Ab=o^tg RD9ժAŊ:A8dФ`\I1E k>E(Uh1lʍ(ZGUeq Ӫ,Nopׯ?~i{S9ioC40}]8afDw/_ lw5UW-<\!_.0m,r3`$sq d-i$9g0}%G-V8#~tSd5E*NUi-HSɢ[ЅÕ߂Pd#Fϯ>|\}| /kC~~ٔ`'8LN?'~3>zX4nrtaY/O,SY @0AUV !g+dy}x@ȗ몮5YKXs7K|߹rؼvIgkd}-:&%lTUqR+*8@7v=jeDwLOv,6Ys=i3f\nwAX:2 k頺'JpU )o]c^%01U(~6 5nw|Y.fm73kGpd?DND1r IQbHH| `}yyNzN s5lJt!ZvtpozO*2lFU>)SP%-I)P`!(\^'^q[;Rnqa窯E䚡D$C* \{ (2!@TGKiOA$Ha T~б;K]8F" I(2Z&8=vpK7&R26{閇x$9qchXb#& m>b TȜ74v3.m+'2),)h ˭ғh<ηka24i{Ͳoy6n 0/rW]ƀ :]ՀZgՋjS[H #|[Wi&WF;n9dT4U~Iٓj4yg-\$x>_%yr1lj=/?~Vܭ 8JJKReɒ)Q .QSggr! U>OV`w50 %$ y~q8gPih]t?U'f{W+/.#Rg7*Z̃4мJY^'j} ^czQEiN{#,{38=h]$u6~yy>z_ޝO3Ku2H/D!m.|P:0eHGHE*[x-Z2+c^G3lX_1#} fJhiM.!82D (4rou:%*E JrOa0 4 [G#0I lI-bueڥe*zF*N޹ujNlrf`<B:PM /[Uκ}yN7nn>(14db'%pNsMA"+$bA[nI WP!VHOQG#yveTYðaEF} w|_#;$݋;Tx2s)!FfZe%`_NB[8j09nVRΪhits&Z 8|>`&EGfL*q]t%~ ފϧej;7}lo&6Eݮ,,TuoY^Gql:hXXOd!ENJT$2`9`9rT M^+)K$GiV &1Kd)hM b?cNNw]J&f_-"f:Dy %W31L-jn~ Vd9,OŵTL ]'Ǭ(K[k!ڸ@{x+ǒpo,DYg2z>lYǓ. 
yMȝUi{te +sRx>\Jwzeko~\o׫ս_|U ݦ浒C6kc-bTSqm\yxqw;fMCI%Pv/rvfT_|m.6Dh~7O_Y՟;9<PM g7J =H~S;s"h{`sbNP428InO".BZ@kˤLHh!wjCqpl.pC !J QT.53<o P0sKp4CՄz&0"ZBhp" љ $P2kj?)r/>VT?X'O{ȇVփk%p % +יX:!rK_zniЎa yNI~7 8OtZ3*gЕI#(6yE0^Rmr 1 n:KպF:pݩ$7n(E8]O: <)/B ZjeePA7C=͜W>%UB>Yeʭ=Z>S % Kg&rrݰNru5&ncnsݖA\}+M\5az3LRl㥜nZ|wp)&)x0DB 8(n_]&jrv~k}|>|ˠer&q4mJJXk>DPR,]Dyktw Ӕ,Eɠ@}(v=~ o ;o)"hwx'`˅Q0 0@g 0S授tωTtt/P[]9Ey% 84F&m(b ڑF`hU!kp_C$SV2cؙ8b=/ (q(ksv{hsk uAoѤ|C(jS~Tu x&wG7M8эĕT>lID^sLJEH Ǒr[jܼ~Qbv&P|P]d*IiAO*98ȇ6ց4d#潳yn[D$JX) I6)ɻ6%f'-DzV⇨Mڨ}ǰ(ks#,}2SvYSTS Dy$JO݉h,T\GRt oN 첮PW}vFu@77+@}{S]TuiQ۴YJavj{WS۹H!d:계DT@* HTѨxdFo|)e=)eYl*C2$J.EbhY)ȓ* ?N+Bn#4FI%ЈU(G-!1͢^89ξԙ8gC: jejmWGIJ~5wc_.-;/7ʀX%E9 4H,E d&q|vig8C9Gc)P&3iL`UҜY%;3Gq2h’`eKK`IAL21jwgs/`]$cP<8h(Bwm_!.&~.ioi$A03kYrl!%YR,5e1"<;6F3 YR "R %ǢSN߳9?cJE| @ J T6ŞjZa2NZybcIS"1@;Mz'>P3B=:g% Vko8N)$Oz Ȁ2MKnPpoHԆP08{A*K#R` S@#1ӏr+"([0nveS>QxҴ_e0Fog\h*(^IZ(EҊ"r(h,𘓱y]4䦣C[\=m`}WlRM]AHdWCBlrO&,w6æK>G 3Ǔ?  k/46!:[KA57϶]C5b'B!N41+^!sTzp&""82 w 9r[֋KIyuImPI$% 5Ƅļi Os9EzܢU^8P弩sl]{;፱"G>OL>zyO[HqU2EǣaI%*nFWd-`)ԌG8 [5A 5;zknQcTM>_nw:ZhBX=s%Q7^K=ӕCڇI'BUM6skᬪ.н5sR$l[&ˍY"͏ [-Wք`!pJ>K,h`-xk"&DHFE+=_IXr\_007Vs!#=T7^:;a5vy<!BK𬪖NnQ]]\gh8:qNȲ]vףpS`\;ka{nFXhדmϴi)eӚv}~x2nZ ŊZ\X3c8R,kkl!zgqf(YHry1 xS5o 6*R 㠶ܧg@|f;A"WJl+sCNA)KͤDJG)EHe8Z֖-th,ε#X_cOt!vE>G|1ڳwM:[Gȱ'Eh2x,À?ThFX?X-5A#V!!ʐ])I]rPD3Xg ?mJɆ[qY1 Ds5a6j͙əT{*YrP:A1qG#Ǭqý9>A&P~uOwٺ5Ѿ^"[O/ᦆS=Gh9fJ.bhH$Nw Ш *%rXCDp0㠺~w uiweEN2ͳ6pw ,guQ|_߁,s8>d盍wp`t~=aeɹ5OU]MR0Xb/'{]5^||fގofڞ=nsy4;LuSLᏣ_=):$k!}mINOƜJCj)A}o7)mxzz//P+jAq@ p į`tWz9:Gk™bͮ~׈{}i՗w}Uhǭ‘7K/[~=;4⁛s6߿|jwtyv3Ł*;z|j.ohj/l\r6ۻә?Mb[p"MCfk:hr$c~2l6xgt1٬.4[F^=ŵu"CĪJjtUifNoxF˳ڀɔH"F\O9nbRT9IIO1XG0GQOEfǼV*ʮ7< S ,FU>)e2F4P#^`Ѹ ezE@/k`Huӓ]/7$v$oY^XõW zR}OQ+_MtuC-! TvvHu9`\핐HМ>l6E]w]XYaQ%IUQ ᐳHVml*4fDSFeF*-[a[EY+{!ߜ-HY(7XJk#f|\wZe4{2zس{bG·Ǝֻ4q|J(!Dd|2(ALN rA?kC^X`Ѵ=I IZ_mCNҵtئlnդbRFeu=}0hyhgi?sv:-{z`<=!GhWU-ᱫk}l~ Gg5KQY|vVz;;n%܁r| g [󥫹B-,Tu.0.ѩSiORieBhFѩi i6PBz?Eeo@EE [X2e^9TkᲫx%!:2D!Y ("NJx4$hbϳnɳj#`\WIw*/Zvϲ 1zi0i-+=ջ'gdNsg~즉4Bc~L)uCxɋ'q?]4aWȲx4|t8zcaNp;q}VoYV'0CY2E8Jd@lY;O-3'/|^!.) '[*-np<>XװfÁmJB(u"%^;l E;zMaw.(q_w!@A&B+2pgj{H_i 3ҒuyfϣƳ/kH54 O/n v2*+"#x1FG,T!2t ƞKf>CEl5op^0^7CW;07mW{w/ro>KKRvJFՆ!Ce5 +Z f fvQE~B;GA (#i1D2@^)NG樂s9>;99/|}cS[m#aaZ-\Ȳm~;>z@"&@MH6&Ahj1dtͩE#rb# rJ9-UJH U%{/JDVĿ-2jwzHlQ D9SƘRP%M_.YF![69썜cת+@w ŷg5;5{ͭ;-k_$r+ir4VCXѢN_c9*l \;6yyq%w^r;?T~ח_~gq=_ܸ.KoGo:#n%Z6\Mh m17!7cDbV }!~ߙOmdW`\QdH];jJ\Q: c*L5SX ФR+\BmAY%[x84)B%IY,9lzٽ#Ⱦ\M#Rr +y&Ӧ-n!wd%mB$Ʉ{9F-r.hia!X#E4.ѫyMj#1]XR.Ǐ%(Rm# - ꍜ/ 7SeHbS&H i j'Bȿ CU+'E+=PL…QXBXrٿ+a ؘfNM bDH.٨XR:8$!l-x?kլ!xNO>ɢz:1 t oNj 'v'G=3:1@.s+Dzu'WFIӕ{:yN(x"S'!�nrv2o!6ɼ 3~S`'xs`\Ґio8ޓc2@SR¦6OkϥH96jcrQ!H^9yCņ68,m/qP^d<)joԹ(!G#2z*9 1! 1KQ2y:`#e%:-Ӯ ۤG8TqF_\Ԗp=V@9ୗ!@KiK{#2#o!zQ\NMQR#)E%!FjV(ĢScfj\pz/>BdaFRW84%tzXp,XQcӣPB =%@# XkwjS а\ u[1Iڧ2 hUOf?$y! 
^Z.^s*.`S1W)f'>F!1Hߋs3;b>CCr27anVٹò5@MVd+fuJ)#")L!El\ރN$/K1Z>$ơvQ4iן"$svrjn[H HL<(icd0C2s  `"'QOFK \7zO zAP*$E֏AEXBRJSZsy;DOk֖Fjd L 29DIvzCu'%jN^06aߝ^F N5 j)/C\ ,R4܅%.u(fSRߡywV$6ЀDJTx/ݲܴ%*޾$FO2jI"+j$KWr+M"+;(M&,MQ*%{ &Ϋ_V¥XA;>ե%oy?yLxF_x6 &xxWyY<]8zrp}o0lA WܙoϾ[ϋߡ8+~dr8즯K^+ߍRk/OG$F4m ehDhZꬴt7& ᬬy|r&fwJkäH%V{eN4'jE{q$h6ݛqףmBEvxWRK'TLP(4.Y/c塟ˢ?<6z#:jn!E ԤHX N %1?'2Σr(*@>j4[ 7,I-G5FHkFz9PyyC!Y97iǙf@x=ў)| ; mi^h 'xr@2N`D${-7inyfC߉kyѰbNEnHFW2L 6Jʉ"4 GWiG?P6!C }";'9W6Z9!h(2i]RY6ʡU(yC+!C;@W1Vp1!TֲsM}!2mc· ZL ]D\7Pؗj&ʳzky)=1yeE>S)_Zv8Z lc̛0rgz2:MTZ]@am`F\ U!@*/JC/w1 S#w/3E{^m9237_c(0&ßd/+}NYkL~~n>orh鱵Ua] N[wmmGr}UN*,;y5WP}{R%uJVsxp8ԚqTC BU.t@g^;mʂwl"u}P u2 5AcU|K`Qd'PlHO{l;$liC=RWo@OɁPOz PO`jv@)ԌzF rz9Sd}qEj"Ŏ{TFI)X<=gBGP01teVnT`eR:1*ّh=%*ECv6FO7%7&lOw8]6%>{&-]w'ylxry|.?>Q={m[!{̊ކ_3]wmڿp bj &͠bݸ|zp9<F&L7f a+/k1ȴb1x=zՒ(9D"DaJS4py3_*}v\D]y+ej A™kZC)ʧ=%?oU+W+ͯv1u]+;~؀# b`ZvGš=?r)7 ںX+ u|Ca_g t(HYeU2`876ʮzVEwf+2,Zάq [і1 \kE@m5Cd!ՓV+ʃI5F9}Мp󤧊x5q80CihPoe՛qVjyȫ!f$wt9?105[i =MK*䔎[RHFJrSG4`Y䓦FxEE©Vom0[)yf0zQmkC| ZiB`\puÎ U^vC^ܻ Wwm44R$_ ,6 ơ)H)$$PH_`J;-~ɝ-^MF{5qN-f7 %"I`pWŅ(yVkߣ(Du?^y`~]ɦJ *!VX~}/.gjr9Lho1Q/1Na(y$DҳEɈ ge!Vr=gT^ܸ xo2<>f/FK!Yi9m#Y7%8nZҸ?RRŇܐՅK* P1n4 gqZ eHdeth;V>U1*x)9'MۙEX_':ZrW%“Vf08<9˜;L<ɽDj-pRK$\$(Ӯ8   /|>)m@>ز._{L֮$Qw3P*Wn!(9iC4̯fCpZb RQωmς`$P?Ԏ)cp[.?<̙!'#RAQTeDWMь(P2N_5av^hQ5 |SE%@&녆dwx.1uR.#xku0(N6lp#l qMJf^YDJܻӞ!L.ȝ[c*dN1|v/FQC{E Ҳ ^!SFsI3vbͳ. -rv&3z}՝M%(7:xT@ m&odUx!5@%rB`] v6T1ȅ6$MT`UZπU[xPYS%P=>,.-Cn 8 l $`mp'i %PkONGp%V{8~|q9,0SC{T޽*DHJ{L1U@H(hBwVue\73ѭKNKlb~i&ӆ{'3-@PlU2t5ip+1+ KPH*-c<@W&5KxMĠɎSҀm-rErrr2x}_ Q;カ{q_2*me%RcW>~IF ^AϮVn |y/}Ky bFJQJoMN-4 sْ˽#!?RREqfL͛L$rtz(H.'mz9z(:w%Fq1/)G˳~Rɽ$B+387-wJkLyݦ(k1ʉE:1Vv_i)Ӕ`i!qgd^2%I!xw>r5%%^keuMX,ϠDIQiBGOJ0^ bJ*:'n^s'j)%#}1GXIqՋ٧wSv@23Q}ԽY\󹠄v&Hy<5O $VR:Pi'ƀ×>鲱Tw~3Z"}:܈F^R@1S^@fs\K@Z%Ď.tޛ^=C%lg~\hr*Rv %r_z7Q0U+ڂj0|\xr)7g=*^Ju# ƿD;dЅ#~r7/6<-@kjMekhj4$N9}>ԗ,_R 4##^`L0iEDg@p'6X"[~iJК~3yFq 'QsDHK'mXWcy&Ve$Z"z/I- W=cxCA,LOOWU]]%ql/PCט$dKLhIٕ X+[dA/5jȵ8 ‹b$ʰ-0y`4&( )0W$xz]\<{u' !)-liS.{NdO# |x%Cw损jL{[3.?+Kz0|Px2h=w!e#48٢SHT0-5`k7lu*/^(OA?i8"4v6x =  г}Ll׃A`ۋ?TBY䚺UniE GFɰ$^Q'Lo\)zQY ^0_d.E} GcfEE4IHUk^VzA8.y4؆[@e=iDj5Fs}mhgqTN]NIbT@LK$HL8"5 (T;k u T,w)aH}A/;`TƋ֌*j0H̬݊fB=DRCj1cHxގi!ɪ}6Oͪ؞(ͭA- LDP'HV'8ϽL`X+ F&Hhis <K#M!$̎q 1J݌{=k$:/N;j} SB5Z A-eko dm&`lMύ 9dJ{ƀ?w̼1v_aq:|@DD A8΅. wJD:YHiWVn1]x+sT-_|GA3&+ +-P,ʮ2f7 k֨PCC`%(} !K? }xvn`ݛNr4[gϕYcp[7`0 R=A<#S" $+Zەx-bTBXG_N3 Al/DRUrG]Nһa_khRJ}$ 8FZvm@?_)s=%x<=n9E$\OgupՍ=8<70)w܋G S;GC_Bst-[i j6OџBf>.iZOl#\fR ֍_w]&|I .5Htu:(^m<oqW=;k'!|@YHQ 轉*)7|fOl7[܍/W 8($X;˱HIX80" [I-J1'`x*bKr`Y Tn GX 0fkcs1,Ab#DlZ$T@K &(aSN!!]9=imL5 g, <!c-GknX5x񮡧 = HZ ;Jǚ pw(XP`D6WF ijXlɵ1cMX)!)J8J{#5e1[ӜΠbk趢 i(ȁbEs0% J@%`a m}:s}/} d-B@LBrDӈVBJ)Ul5U$ {zcps}0bh$\2M,F`%$Kd=t>4CiZL*wVBy5 aJhdQF Fq@&K%R8Bȍn%%ݛ;NsVg6jPPђxЪ ړ^A4LyOXH(HCyyYD=ʽZ9K3@0M2CA7TQ Za rV{᫃Z&q*x#GecWƿawҼiNkw;h"t|)M[p[_nh\K6ۺwZQ\R(+?t͛F?޸ :EhΖb96Won+/ ?/ozC>u]ː9ڧ. o3-k~?D C{/zI|t ob_L,iey.U۫/†K#?<YL202.T _uJ^\^^ OR_Ah u[<~%.ɝ`|@u (Ipq Z,)_ L ٮA:봚Rm}Էj5d{5s7t7 MWwUT(h;24OnGDcخ/+؋ vlوc>ZV2%ѭ:㲛6ydSO~o]c??"vn.i0[v={_#zſF@å(] F_+#Fp$tSq]߅3~0o~[E߿Kaxs.d3Jeeh-PLgQ4Jwj2 EHG(#q̓+ak2 +Gx O7ݲ=wHIIGB~F߶hRf Wݾқg)}'D|,jle&Eܿ}u2^>$BH1śV `,nJ0FΫo2d} eb$ X ?NB~zL0M޼yC#k+C}g6GHzC"G)n&A8CqE(VU\8/HcPHB:( לs\ 4qlc٫PiTbgZUx4c;Y2.Z'{-uV?o)R#ZqH>r Ś\ LC 26ݡxDU\Bc4y;bͷD3'!c}Ҩ)Xڧ(OO8BJ,chjo8W\+tE*c4 {a;|]o3"MKdx-q=Jg q#2З^"Y$"ArrWKYjFN+΋X/&lkfSU*U^Ekr-`Pk28Z xs5[@TV~N?=:ۧ5D3KNcDOnEOF0rrsXíx`K˚gӲ r &W%Im.seb8?sճa}1gtѐ;3gàLe~GdBL2a(~0XW/,(( ()0oκ@`"5wmi{Xּ(_oT,"GOƓf~|_*-d!)dt2׷?[ʭ$ZvˈɣȲ JdX1TV2(9[fy$ly4\w}EWd Hor3fo{K;x[x :TV;qn0F=! 
V,^]6o۠{5n17a-f 轶[=&$o@LmTOiRm(pA0=GF5չ,$XvY{7hP& ?`~aOe8X++Q?UJ5DX'䬙,]EgZ`idU^ g$TLKr4wH2&EP>QnNՅJ=,45[^kPZ_?.TnqkI\ݘdW߭{#\Ҍ=`AJ0[|W>C bZJOd/ng./+ol3im7=ZTScp2t..f;[]1ŤΪ)Ty+pwuEC WԸp`K4N5s-c дXBpԆd9 ɼ(&]e:#(sN"夼,ZY zB+pw|tŬ~M %O]HL|nVtxr*̛X{N!@RҀ$*%p`>P&D:Hϳ%1Xj."j/5 FĹYhA+p9Np/^0)JoC6w I΀|1^ و9%$"EV3Vޒlҕѐ=!wlP`ǂ ^ ^K -:PȢ8fyUOфq Ɠc'Ж>-q8'5@:=<: .lV*p+&!kg]#[Ln68! h ]TS[m$EnC'9]V*E6^qʲo× VRS1D JL tRQzqvN i8-i6hMKcFK r^Ch$0? 6|J8 RA1Q#R+U{ k|6`[gF炓$;.f r8r,EmeCH ^_eIP;?hŋ)# (-ZGTSI/7C<-XEZ({ܠ6A%ҢƑǶA=k)dJzIqØW-luL MxV b"bXD!EANN,FUX#1`?c NhD!HȅBDONIaՄ:F,V4=_y]bͫ2XrNZ(7Hbzy42ՠ _Aq¼d)|j WAAnk^YmLW: =DQ3E2de Y".(V+R#8ܚ\gTUN{;\[bp S%-uLԓD)ޤ&[%ŏȀoIJCڧO+(0ܳvۘYc%_;Dn ր&<^e[,-H_r:'WU$6Q1$OuXhJ ܳ|ljܛ=-St G tT2ZD ԙH"O3HQ%H$i9ϔ&CYe^Y Aޑd?9bU+sI>O˕Hͻ-o48r:?^}Z=OaZÿCò>-zeҞq iVHcuZ u˺$#Y;J~(HUO%IZ/`=SKPd5SK4^!it]e@- $<2^%k@Ջeχ֔B#jJBx)j!"燺>3ud[yfA UsSs`b͋Ĺɱ\ylyiV)H"#ymA'oMl,o1g>o_lluV=ɓ|iڒ}창ǜK>gWwW>ίg_1xuGw|ԩ4uy|5/֌JFY= ˬRu.{!`\EA-҈S+!Au-6FdmEZ"/-qrG{x:]dk+,z0/* s%pPM!FWM޶n ޺4+Dj1Q7':T~>fPhWs0.OfO_* 3ǃ,^zs0՟m_: ]yƞ+o9j+aֱG/x0q3? O}W bխ*_ ^* pXnpz+w7?mTܫ_ r ӗ^^ ._*`ƒ*6cPeWfH>=%K땯M!棝䡳/ gs< lXI{L4BG7Gج /t XXI;Vdz'aA~q%S%{S/>#$mz,U9"G |CZ`l{}19i5L>}ug.ѓZѭo{U4^p3fp# #%j5#)X`@rܜhfcиwk{JZ]㡣BpYvp+Bnd@ 6kq}h$w&xH^/'fVKUZLVjlU䮈w7ۛſ WN(o[ߜ~vtBiNW>@FM|Z_%}Y7+ 5⛌Įb=fߦ{71Mzb9'c *PP1pQ VhJңJ3Q|!q* g/ 'ZԂvNfˏ~JP,z*Q\y}3l(҂Fտ(,<$J%ڼY߂eC,28YK2rZ핦)&bTAڇ,8RB;spYbSYVTF,>*Q~Ͻӭ.o,/JB4X7`3d]]$#I T,89Zm}kL20 EQh֡љS)8\0B%!j]جhSAV,vߎ AXZ4=򖾇L!_l(NEl\LXя#8Eě6-r(g@ ńjבw)+ nXD!SѤػ&zʹ@Ämo@]7ƨmo(f)6L04qõ1zS2^$ZwWM2Кil/kn!K2)hdiѩf] ߺ'y٫QV?J݌ZMdyo{2:PAB}'5pT7 .36p3.sng 50L+ W͈Ep&b+o_%OSʕP6U"i3UdBFsՎwSoM8ZK5ye- 7&XKV\5%] -OlQzJ.u 4S#$ &?Gs@ QHTUPY7qiUKgm6Ǚ&rl9MΩ,ySC4&KskS"(WR'TӐfhꖽ$!2!2i0*N@?ɵ;xtDٶsbym( Vj\dLEdJWY#gAc S+!(}J͆&$%H@F1.p%6<!T!0Hﵔ1oeq3O4]XiE㫜CDFRIP8tR6a@=*&&OL^q2ӎewvm7%a1Ӆ3T֫ŦM"s̳\34q!n!lÚ3KVٓm\ooNMsbS^)Jsf]ϿwHr8d%N|_2s$f>[I~ŖdbկRzfX_),Qzog{cP/A/0-QV!C8hY,p+U&OBAoƟ Rym.SD͸<˛j׽qɹH0IC#=5h78#FqbDXS֔S.-(WS%& Q" \C:н ~@k< b5 hMHٰP6UCdWa= B@0Xpղq- @pI'tBZtZsI7 k!־02qVh J 5ӌ%:(蠳< 'SBДʏ]~ Â]/'joplFTwoRC3u~Mny=NhuW:D],c %Y@GI!JzJIB([ePB4 oPB^*rWxϠ .8wR"㓻Gro(pY뷆Ra$R)X~yXӇs7VF4wߌhb<5g`&\;:ȉf^MHLe\, 9U1E)^Jfefq -a@TS<_m@XܿƸJfqi Mdx NfqN(YQup{22jq ez.`R.EYaImF):sH*ǽȬ.#bnAUT@b#5@|O-۵s)bх s \Zx !Mmp j,*+:$A%]j'$7'H gR FY @ Ja҆Jۆei WwoXksp$\SJ<5&3ǰz*8Zxx9g@x@ĕ96èUNSUSB`JתPtT6Nļ><$Eʰ-zOSw?TwI @s@T2]Vbǵ@`LqH|'T;ω?=#Vx6I)Х$I<@&G8x ~, 2g@)CM-3Jh)4h޳`uJsD< o8:2-2rEvADV귺ݮ*) -8Q9c~ gdt 0};Ix!bLm x6hmQ5 2u>]_5ҐfQOli *2 Kaf /vݧ?Y6B{:cB62JKRp(`=#YKA%6U ˜JL].'A)KVF2=|zcmg䦿g "WE[Q'].ڲx%uzw5|$ `3Z%~Do'ۧd0ItoQ.ci@w#b$'DJH8[Ak:q=\N\*YϿ?U#Bt9/ΪiOx/-;N}q'X.c9m|JX&zH=0): &3$3$|r͞}cR$Rh͜G.N4-N;:,\3p_$d!wD ^LT|Op콛.1dHqpW `mDt'ڣ(NQm2Se|Dfh;)5`73 Dm×{{JoK|׎ \$Z}$_J`xπ e9oe9Txi?=~F%sEۺl8 zT]3R)Pׁ~1ϻ/_jtCV59ѩs?! 
T>4 џpIPO#q:5b, ASToT@o"C w=H"d "42jB:q=aĘlDPH/k3 BR}7/>6)iwaOG0n_4 A%|Zzva3vVJ:T|7E Mъ0M9GTD+N聓aĖ .S]+Ż8aRdu#ӘJ|hiQdM}&UH%:hͫoLV6^*"lH;DW&;C9Sga3Ou2;hzGڸH¹̙0[^Na})yN 1 +IEM6x^*Ժz[veu)~> nn`[pn80;sye`G0z(Ѱqn^L~])3{w4L҆5<:c(`Ez^Js^$9M|̋W{)ryoFeQE(+yiFΩA så[Gp78 x N;B!t!+yKn”d(SB76 ~~(A+ eQahU0ڹ}Ѩ(,I4)Cd=͍FEuջӘhtvPܨx4鿧(*}C?: ɚo{Ǿԍ}7oaSKi븣qå[Gp7ނE*u# c>ߝ0eK=؛ZXVBִUI)9I*˩Ki DMlFEBWFUMlL<UXW2" -1e&yc=bKU7\^LkcgS2X |Wn8qCH'>/ J^#/Zsm񇏵62C y(#E)ShØW`': W0b5D-)zDP_촞2ɉڳvU9 Fs]0_GyK;q\^-QVb@t: kM6{K@*׫]!-PՂ%ii\WǵkdhOi4@6N|CJ@rz)TYHҺ˽my~<Ё~7DR"Ľd"hjmV,ikQUxIIQH4C+kEdOJRVB.vf~~]<ـ\“`~L(][+I ~wwXm |?l %J-C{\͘*'&EƲcOϋ50N1~ ON@ɴeyXr>ۣ7i@]qN EjZkuM9au:0kP@2Rr*U ۚ9wZn|pv ņˌ1LRgy)Kģu :_~GIۆ-=, jbnx±M kvu|!;-u|r̽[g0Kr;y&CM!V@S>0|f}oVwE4$"1M(eX$^Z2e-pHz3>X󺳟rHX(dQc$ZXHVsV:e5Hp{)rG[u=}|\13Z͹*(bE *f裠%yr՘碏4n:LwCKb"5iދ}:C \{pu"WnQC_gb@v @I6MP^2y{06Cf^B޵6cٿb`ߏ,ӵUtYz&8ז'X_RCeEINPe7e(JCCij2UޮSx8kKZ)(Q9hNc5%a 8@>"B] #Ty~^P:<"E9Ԡn'G+3͓Vf07%W:*'nMh+asXKMO>Vm5X8n9RՈu!4Z&z!p%O@㣱ϛ.|ek[{`~$ґI ))=%T(V76+mK9Kog/dL4ez߫AiKߢ}M&BuÇ?"55Vٛr-_@Cjh0W:jI^ ώ9jL4iCm;Fr) FI7Flƈhªywmp֪0e|gSR̞]"הvinWudsCEFPd(ج0S 䯔YVOi~/oԣSwlLfT2!=[\,/ӹMGgd[ ֠_65@@(9>u܏KHֶ;2XDhaorlr>UodKv7?PI!G@_ỉdBesE?&ߓ4=j$jjBF4Z|8 Sd7nя W9g$F&rɈ9Wg+?e7Gq#ή`}DVjo*k6LL?dϝv;>iv!p4>|j85v>M:f%"AUΣHcPdTU/-Ay҆ iCce"( iX#bWOPWpGA{= ;U#%Y}8^斗>#!rԯ"+^z+^8w9^m|~*Y$p/ȵ],n48zSXKyAL0[ >zR YLK> *T}Z"|/LJ~I_(w;8T-7_ 7} F(r$u\F @_.ߥyF`Lyc[SYSeRQxj"&B2N!rPJ$G<[C $ 6)l2 vdء'd6YFZCCė@9"}?՛xYS($"V|k<XAt(yi q?" 1e %XH@D(Fa  zAx8c8FGb@:4 ų֠ӱN&7_/: *#ޖg> ׿/ׇVG<,_'o2mq?x#G3?w_j^Y %U7իpI3V6{~Yfym`mj FdO{%p?s gBe7Bysjo!㗟 V/U f2:$e8$*zzQ##X([1@hzrɚ=m=>{Qq*bm:=W؟]].@!yTSR *Wjozj_;CP_A 9a j[WKD;#ȗl,F|1}^ ml p.aJ'ӓWVw0_OC0$q9?k 1t\>/!B , ?t-HS-+"Fpilb Y)k9Wo-K>3:5}x7]Tw=}nYçOU Yj@v)8bG}>&U5g12~*~cAugmw]v2UI,̈́uXN@- `OUJbWȉ?/P'pFNnLe%>uG毑$._qWI1J$Q_YW>_Y31 q' 9R츒9)T>iiib0_.^ӟu6қʩ/'3c#wWo;|)Ġ@ZkQbqipccY֊ZhX4is@^{fU ԯq/K?@Ҍꂳ28ȃ1 BD,TPOHI#EQ#. q#wn~^F+z0tz6̢^ΐ@$u40;V/*k&e@guzC|`9O~8Br!($jiEY+Pˮw5Bj2zAT3A(-0$B0`<€zq*$ PR5[+)u  xarcA F W2tT!jHȠ #-As=1\Ƅ y[yQ|̙$jYN?B^oҢ rMU_ tC6K#"IAށ@] ֎whAc[>SAgEqx3o/3 qř<|3v\(n̵JI^ Uve.SO7Ni\WQ&㹱·j@7[Ek:8XI̐I3N] rLƗ#SZ_sTC{jܯ,=niqdEʘ$ƫ׏-9OFh=`9C5ճ.!Xpٙg ڈ)VfHQvGRv-0R<AĐKHUOCHJ]&Ŵlq CH `}pnK>svyɳ}ۛR@*\VW2笹qSC 7u`eMSȝ]xYrnb3.ѹCŀQaȨ8qFJ7] (끢kNq1iSi[w8}`ytEv.rΛSƸNqj0#+ WD5`v)Pj6Odꨢʺu6wH6Eu%e?:'eǻpUQӤ Ky&˷Mƒp8T* 㣱Lj2"]}96OМDX#ۧěER@ '![\5Ȣ"/%;m/W]HMe\a;nI2 AX1k*ƟiA+:X. 6}̟/`J 6# XSG΢A?O`aWʎZk/fS``ݹ$U ]{T"3Eo sk}9[`a2Hc  ZzEITsEڎ p/F{|Db|S? X(=bz5!&_r2ڦAŐ1T1.½ 2h,Xt׍#p1y`AC#z^y#`)Z|X|4p*v-Up,g/ Rlzl #<̐#ghUKo xCd>:fNĉíen"n^haI~ml@QHWw >GRF]'DсtDG(A5_;kP\*# PR\ ĤV6k{k&E{NPFN%_NϣQi[ܭ1WL gdJΟ1`BKo+ qZ/Gf4NP}:BQKtNTfZR]*f**9JƼh$-r myJ SNa}u R΅DVSXBwX B8g(m.a[V+whw6SF` &[~ڲF9GHqm:iY뽴B!LawЮU7aCXS%ŪMbb BX*Ɍ]rɩjNKpi`x#d}u i7L]՛&1lmDE=ֵ-Fb D+u+G5nHIф^Vy|CI(= R9cGWT59i(^`/yWWRiIY=Ul\1eC-Z,QFf}AT6QhQ8Nwo[2x&4rBgnqq0)N"96u z8B"!GmtѦBEr RcJ!5e kM68l j޷aTmߩ]e HrUb-Z*-Vmp6#3MAC^Bj*Q7^րegpĴ6npԘ!A aZ~G o{8T˭}9bY"ًf].7/g׸?f3iXA 4?g}:EH$7l^U+mH֭Y lr${ ̼Iuo?0_o "Q@DYUގLFO2uSTWC̘^[׵eJ[Y lN(UP֦'ɲnU>DƔe7DC"Nߧd3n*3z6|xm6ۆ8X.o⍶_zᄒPy.:0/jW3v5Fv_l^LoYnњ|jZ= ATIJ`X*-k$k;xԊBVk8[7#) Hs{K1j;!RZ- $=1q[2CtOd9nQi)<}?LoxVZiL<` C"@瞉ҮVR"ヤLnEUQ1!6QsOMJb%~cQ=RBd.4_a K( Ay+ ͼGDof#O#O EH۵p/r0P8b &C%()Nrۼ*yy' n29$(@ƥJQm'8 ikș1Z5ϑw'-3X6+rRD^ I>-ZN=ԯrXWBj;CPq{RΖCm7 1DVtVn91ո\NJio Ě%Enh^;I J Z[_*gk3CR 몰qa!p,t e5TC?*a8 ?ޒ%qIGwqyS仃RqnA{0#V&#?{ DVw"QfHq)1qEJ9@ R@ߪG^ "'qKCR@$B0Nڂ2`ndg|KĐ#Qsǣ$$0$$P<Tj(0BBܧ@FDR)˜N1RUojů1FEIfDҲr $fWqB T! WR)"$ #6Rzl](. 
N|0'Cp H 2b#qx2i#fz MIgzmfm^IPv1_?*qD\1Th#7'`VMcoF:/`{R:s=TYv wWuW?.B?d lMPkaٱ {ߦ, $CM7e =sƵS)%:wTNsFPj) XWZiԁc}FQ‘bo|VKe]UeqI_PZ_ʮ7G' _W[>9]ڂI9LLXKCs M?~y˶kՌFv~spl8lN7m8RU xj(B/ IU`Q4ABa'U^VjF=撒LpVCDvMpgn];b\sJŌbSV?Ԣ){R\2w IOih0gvGjLm9ĉXZYS(Zo)<}^]f`pP.ĉ6WaCU"9ʉQQ!5jE RѦk0dOy5LڥYS`d3kɳ:7+Zau|,3GCu30 m0rT *9ũj pi 28 @[o9 0+eԚ`tZM_kM >@KI*bVEF,uFdoO>yʽ_o?CWU~Jkϓ "U̪F&p4d,ELªJ;a3ş/`J2w%Zc墄ܜt`4 ᆙھҒ-zP_px;~S^NiyNKM^̂)sYa%FO/MF|WomXE %;)@piv$@"m dքf.'ћ!.S/vsœ>)y@?x>'H(XD AKŅڼ:FUOq\/@rL7C5SΞjM7Òt|"ي3qPv6nPXVXM5G(H9Scf Kmɥ"ݍ$TbCžo µ!r5k:-PPsCŪpx=Yuo0 Ӛ !CD$7wi>DvSB љxK1Z%{;ԥTg)eNSԻ9%v{g!vjv=T*~wkD:Z&"<ܳ7h%JRmXH'=sa cȾlP'ՠ9% /QmM+2&,u ˨pw,1$ܽN ie:K}N+@-uvu3T`AImTtޝ TPYT|t'vS0;kɅrc% C,"#}$a""eS>Tz7 0" <0C @RF!WBALǹ(zSZI-w+4SD d)ӽFR!xJA@*8BFk Uʬ0Ui gz \t8M8_j9ҿ1:'8GiF`_{"^w vs! ,/jueζvS7_e s^ \_ ⣵X 6b Ӫq5؊2'ڙ-thdji6m#0//}#uKThhKVxFpgFȩC 6Bj:pnBjC¦swmJ~ݙcy$vo7:=ti:v`;==8}Iq(QA;bXrZHN#1wIpл@\0N*oVAY:FU08B-9m9pSdI%Bv\E ΩUQQ}^2Gw2HʌBI7KTyK z G+j!똚.GFQW ]=i,Ľ--h!B\8 I>Nq11EiFmzm=%9s&bH/٬M0wF8!|mP>K o4*n­{0Uqc֫H-٩ۊ7oe3U )(8r.-DVIЛMRaKh[^]/5Hf89+H0 AJ̴ShuQ@_Sphft )N4_^NB"á7B)N XNW0D~]BAkhi-Y'1!!Z{LIX\,U"u.,j9CCR (qWyͨ~,~Ɉ c'aXO"#Ў?VU@@F&y&5րSLVz72Q΍{+ 㷡d eGx$8#B8y KUD&ozTԨ?jB*1wub(D<1*PcR @ =S9'ju }I rd u.bƜ l:f?fsDu:K{7#[ݾT:9QK8'~#ZLv{h>uy"]0Č- ͒X8KPe飻͊ ~gRTKI 9ΡO+-7՘n}V꧲qK/5op-Iu pp^ *f :6M1_:ox0@[Off#%}L 9px[ MZ 279 }S@wnV!νQrݍ!rʍƈcJ@eB-|Lg踙%N4X,e`obQ._5Lëͫ F6O2>:KDҿ2I$XD1&Ha8iD"MKNHBdžaݎvq%"8 `LpDD!"B#"#F!, c (! jhCȘB)Q3?f>&/fhzPCbX,y0qJ% NHq(NQ4 (QDM5BO(MH JP.tX#h1tt$ ZW|Ub}dcJ0r%0eh0I!đR`X V#%@mq3v8TNRT8I3iẓzhBzj6 ~7GUT-3,?|z}jT~~n_?Y&G_(\OC8*:~෯7C5ƗjkP} Gُ~&)>3 VW1}f|kɦX!TO |8#XG>/Z=k9U_vǢ5sZeNI (nP/kMjzҡecLwrխo+HO(bPRy(r'i@սFc([}Ft`ę3;vEqY멤(N-[;sE "o-@SsIȹkkН.M Y7&{AM3¦QS$8W{a Cy텊:4Eb̝Ӂؑ.B!vBA 5[0ګUte4qsm,5s L oRs It ;IBÖh,HcyEN*TkdrptR)JTyK z G+j*XԴxvFtA;Msxv8=%9s'a7S8IY=.9:4hMUv6Ƴ*WHlP3/I++)8A)$!H| :!8qIՓ7dZ9Sq lւhh uD.zN %BÖ\` {Qv5<٨\p/*L+VBm7Xc:d +T: 3Czڻ*e;I(uχJǘ詺'QźQnޅ(-GSSwRN>Tƥjn7ô058|{kFYߒ$Xh5|cYl΋{/.f $\&z<.'`Z+Z76jm+-1yo_LS5o>v)q>Ւb'"I]9 Te#5?$nEt=fzb6hFx ߄Wb//E&0%Gx#"nG4"6/[a dHgI10S&^WJg #~7c4HpEvCهW?}Y0S;3ȝ~{`+΍@c3/r'<6" Dkh^VGE5 %c V{:.4%8&#Ow$bÄ$z>ycBɋf8V#\M^cw# Y_TIlsJoϹaD_Y wB/=>hZ*kArwBS i@Ua3aU0{I.qb1;aRK~`y\?̃ߌ;wFbDس 6ē8\CvcV%=,13"#Fzݎ¸Vؚ!U0Rp\r@Gt8i`LjĬaH\poeS:5{MBmkz7MS1ǚe{eF'E{7-3hD`v@SĽ;htQLtVq cIH/A=;5ف:N=F$V?[k& [muii>Fn^7ERIF1@$f02Nkm4H[g;IL^6/#c#ǂ3!R.H'0Xe*$H%N786B&Ʋoxŝ}{t"i$qTR5@@b&/*eS OH<Ss (d)jkciD8Q  XBX8! 
&4I $6@2A '8:a@&Jǧ!Rj7pTL#ЊכS8~3OYZMYl`Q ZM|b'R'-;9 a F_BR//E>0%Gx#q'(k'WMgnέO| YMME< R%`j{t(jxZj!py ?}lQ􀑝J~{`%:b1!DWu1G(EXII#eie0A.Kei=!@WӤٹ͙ޫ9᭠rUR_Yң Q[҈xgA=XSs2xa$!( ց8LRID8"cq*CD!E4X*CBp"Qկi eR`nQ$ %<b$iĀDFiaHRIS)$9ƐQQ(z-L( Ψk-m ΧoIVB#4޾@?b=ɟ"N\\X:Hn"tNiD RF1%s3$fJHbȂXDQXpf UA.5€`!CP_e3RJ\NAkgm_M-{>s{kࡶYN0XDwll6-H|k㡇2ݶ4!F`8b+]ۜPq<5KˉB(+ȅq|3ڮT}uism&ten:X,FIk䣬=Y^:Ҍ~^64}^|P&?_\=T4~{E8ύNCХZ*M"Ex2/m~Ⱥi=OlI^fvS豆Zq°G2:m7hVebErAN8fE~ķQj8ZʲRK5bԃ;@q1ЖuHJ߶iw9 _qd'͘1Hˋ(پyǓm0K &vFShY7[E_=93N(NigEZ #!)@8-˰j)A`UbB $.s^k<~uR{H u0!Fha5erx6jnKd3]k>gLfP "CnI)g ȰE@%TbQSψWH,#( &h 8pPqk A8b 4+qd5$o: C!.,>di,6ۇmI[,h|klh=Ke¼ŇJ tB(!@ a-JHbpdMCZq9Om6 C2tCY&j[qz#RQ XCL3$g@cETI_e`LeAb✐;z׌'@"O1_k7oȾq#87b O Q1gr2"#D"3D+ͩFc£|x]/3g~g_]苮FA#F.f읣^4MFpB?Y%ߦ L(@+8ZHpepXrq6Bef S>` /4tV} [ޞQ0]-!n2w6r{:8]y\E~乗ϓOIVW| `%;B:'+m;h;Y]2;Gwn|N59\\|p Pv@vK*;y9uzWsZo۽'-[@SDZw$#8,=# |y<|8 >^H1`Iw}bn 6FW⠧t> ϳ:OZ 󨻉Oʀld/lR?JWgjarB!A_ 3\HD:$5AXxr!& C@WمSw VW]鉢H*a/ Alg0s`dc΋^k$p~*?6k`AQ/ѧayA^-yc@wV@K(:']%р;rq2pojs/!f +]/% F^b!7<;.eK jsl{/Ao>x%d <{&Ϫ |X_ xx逇jE ;ې2勻܊Ȫ){6[ܣT]:{-bFJp/gdY 81'.hVy)ZvWNf3*3@-:&^Ou3% mSz%jDzaڡCZ(fEhNiP($ݝQpgeo Ri($U3mpQ5ͨ(T5Fh@bp *󱛺QӋpn #mkZ& .k8֗8.@ nR8I\)VC1@ [|@IF( c)8e:2f]5e!⪉ct ϧ%^>r7bz\xn%v>[|͔ཧf=mŃynuݒV#9=Ղ!@ GIL `H+.b >QZܱ+jb%Q4VH~$Tbc?ǟ:n,ܐ 4FKRG8(8B c H)8#&`<$Yr1V['DƔ_JP |N;H.Annu ST^ܓn }G.BpN-ҭ;1Y2!77-}9ӷ3ѳ-ҺpeN__\>DT]n^} ^ͻKGLQX>yc?}mnݿ7}Jͦ,Mfye2~;G7;!pkegQϵm ,}'+Eju~:J ƒ(+wXO*Q'y݇}Rf}6s.6Hxyl4 q K .98@ R"pERJ\`u28wE҉H9-yWHd~Vm"|lz# uy{ӽ8;3Jh ȣzOFNaq0_FߺXmz=|{!m @BEB" StGj2 n j13 C]u&hqÎûoEd*  9k~ }m|BH3l g(El YbL62Y,qJ`ipO~h09y8`Q7H Ӛ\͍E!'>${+e8z>M 2?5#8U;Qgt'NH` żB{AG-v`ƢYڣKgFA!Wg= ]A [R 3jzs6+3]>0ĸc1V@v) Hq& Ԓ2 ,&q`DZ,aZ,o\Ս]qBfQVEziFje8yd߽mu5YU''']9Cơп)y^Nc1f'7H!!dL@GI` $F'Yu?Wv#k[gat4o/|a^b5ZK!{a1fU(0⒔+>"g/n3NFvTe1v-Egٹ,ËE[V儔 m CJ̰$Vkq-X''TCC (F U.Lžx \b6!ˍV I U0Z4jNK$V:FhFr8'*ݖPWi?OD_(~K̆ 8 j{~Ђy~*U ]rYPxP4 tq}b,н#Tzxz 8k+U+w+uƲEЗFwZzCn4&iqO<=IJKi\tHTyyyyIkNuk Wl^,KJf55WJ% udrX5>W/I(-TI# nmK?m%886r Ovjˆ[*߾߶޶׎ۣ7[iw=T UWܤJ@nRcFK:8k'v_+!5Z4q A/O Ba&Cms*~T<~T< lMshyT;G5VʕQyP8Иgx6%҆_-PCa -`@jj| &qM}TW ڠ3֦r\Yq\(Kfh>zK:yxިME3> c$8PO0TIn=f[͍r>|eb1z*1`#iQ3{pZ3F$ʤVjVcGG4EiM)dJ7g:ޓ.T g+f*e>z9Uzcp܊CN;^8Ȥ-Q,#IV0P$ȀT " McP:2Zć2yaf7 =T [CMdDZkHl/{ w3ܬ.[Ͳf%\]ӧG$<|եvV~ᎉMvoU;ayL6>_,1qpY'$w|T {^l~] p4;X]]渒:,j+Ed;)J jX@$DeX)I$2,S/Qk.PE<&RwQ}:4!ygA5ۖzx8DPT/O?$lOI6fh. b:@j%EߧhƔ)U)9m),J]JmBhIaZ qEٵKeVB.H0bgg¾HǠPo%ҝm L@j֑!~g4 5;ׂ 8%E- 5R(+XX daT%:ئ\q2BaۼݒM3(-3{˼mK!n _Oy ޝו:Y;b, i+gΕ9xYEG-+K+tV "RȜ6?<0%KlWhxc1FƨQƨ1J cEo-\jXMn(X3.`t6,at=J NnkTs#4 o138A@K(r2r vICV.,qPN!zK. XN\d=XN.g\4Ƽq R0 JtoSp6ēvT whD/5C4J"V߅)ECLؙ!~|vRW-:-?: IMY̢T䯓٪*C |\FcLsɞL$ +J ϯN`ԁL{O>}TY,F/"uT&S+JA.S:惂TƇ 'HΌʚ##&8A VVXV҄Bie҈TrejψBҼ '(,{fB,Syv#CU{:ZeSrEs2M㦄viUk\X^oo J\"gzZ]=(BnrSeLAHJ]h;`ta{`u!<#V8.X fpnݖuה⍲AE]H:O c!QEXͶ!i>9Yo>Ub2J gVqbhZNw;cg`8 Ec!0B<88N#KLzS!-L 9Eݾ e- R, 6 iFΒ8g9NIE0ͣ,L} aD$ˡ#8EM8P +ӗJ}NכEQ_ v_AۖceZ>}& ei0Hs,8EĘP D1JDGęF$"b ;2Hƨ YKu^(ԤCW7~Tpl}=WgoΜAޮpѧPE?.R+AvItA6tܫ\hl@snlVB~ x xZ qN@h۷X;mD_bsbuSEeyio=]/̻=]pMHE9Oo2 rw _mm3mp<\^&Z0 |rWh<(ʃZ\s m:9me7зy{Q9lx:KQA \wMW&*TU#ĠVNQW&ǣ⁦ǣz 1ƨ֦3<)$cژ\m.q\d`Jly9KY=]KtnQ(&joG-6,\AR\Ӭԝ~a:Ճ>A-GAB7$ f IГNҤ ,Auw)$\؁>S0$F v]va}M;cxw ǘ߳Nq¶`ڝCx?[Y_\ܗ!&8@mDe&/ZܒZ[lv83KbX,vPP\ ״;% lAbpDU4[./Vu|Gk\rںZևύbϊwlU5-ȭ rz嶬aAYsFyɌŹp5eLg/eZ_d9 &zݮ_2V&w}e\@y&eC;x_ cc'~$r0'Ǝ}RG毅g3Dn-WY7k1sh8 on$8>`:EkϏJ[|4QFBr MђwC!F*WYK>^LItd-nIS:o`rnD|g+F:ғvZf{YM_`.l:[ps{s ou4֩!b'bݐlr~G&."c6}4@M+h#vP=;BqJ^"M&ʸ5)FaJIzY OtP]E햦0ֆs9|lҳ1=:N I 5wH0}o$hUfd</8YqxL!uC=$K2JW7IǼ u+:n)]SR.EKp&IH%K/’'MhsHࡷI;sqp S&"OL`uOnFDUw|Zwx}gXxC-ܲ%]~hGwOP-6y #WT9cތc/.nͨ kJXBvǰ[ :Fs #D8(:B.X!;;)aݺ؟#/pt*Yz iR%nG.ދr[oE]еjGPWTsg2tK =Rvؠ("xPƅUnr2g [XTAXET=5+!? 
hF)"~^:T{ dLkn^3\.ӿov`v`v`vWpڋc&"}$hrDAFXFaJN4L2 Eyli߶QeHz,m'kaḥ)sei0hW&ڭƁMȀMaqӖf})Nѹr: k%13\r4Ch't6Od"̑o Nӷ6o?E۶&yh+Uډ, F11twƖ.lȖ$hB&NYBi. Ǥ9 TAM-"XKS;rm_];bsJr wEO\WQ@%֬::]4=2܂uW8!D.yܲ8 ,xTROW(f}BEEԚI=H& CXԚ RCђpKxxaxmbj(%7{Xf&q݊]vÉvBֹ4r=p!҇Rvy޺'tM8{IrF\}e9ȕ4"Ш'5~QpdZh43TɚMؽ9RBa}n$x۔07hQiχO6 6A< xd1F:.hXi2)M0QvQiɧoy>0%|ґ%Mʺ Y#Q'%x`;ϵzVnwk8F&;lM;;S[lQgP~Myʾm8VZ񋶈Ͽ2/|\FsOgg{t d7&Lgצכ aϹ2 30>T"(r &OEO#0wzpW:Q(ϧ Lrݫj'o 07si6_^SnaBo Nӷ6o?E۶yA3zE=ݨJ.n=)bZfZ G,vBhTߌV?KLo=&)W8UL땸\:Qrb1NѹEjFuvYϣ=vpn!oϟ/0}-MC㩝/M:(0 M'ݙY_vL??G0qF׼|CFS]( ɹ5% HX hI=^(Z"'*2,%zGgxM; I4Y^qUI$]zmm nl3܅ sƌB^vךM=^\+Vfdwl7vBc_XS swMsj\@{wa"ڢU>1]o:]Y3aQ/GО5 3/H)17?Nt*Vh:|?[8촤b_{gd@cG?z\#B323'[l3?_PSPoő4{QH*{Qe/)L#Ei"zGޱOs\rhl^tH}πd> P)3%껾tHF8 no&/!?Lej4*{Hc@yiB>oL/z׍FFFF~@<N{0HbTP* b Pq,PHT`K`(?Lk3߁h/6f5/=p}ѻy@Iin3X]܍Nkl%Cc +[ ~ybvA1x$ճ%0j zT06zhn3T0{Gb=*Jz"NHĉ#8Ae8<(5 rhl1S g#%TAmF kDf+lr)!,"_r%?i6$Q Q8 C02ICR"@)a z4Ŝi0^1SǐQ8" R*IJ M@A(8i,<%." 54\ *&LI)$!Œ+B#H0QHJ7 3=b 5fa0x kYtfsUeijaoU+?f|.\@=_uq)5D0#ac"q1dC"zno$!AEECZrj6;<0/ ĥ3hPKltcoC D2/w3FTﵯ(5 eC%eTW=N5(ζ\zKH}NF2X P,?,ART1 SJ#@!cĀ$b.b =#đ\1l%AkHRhx Q0lx 2 إXDFh4B+ QqE-e5=6Q(\4tJ镌<g/\ZTs1%/_fVwFAhO6b"X԰iNcb/u!ΣQ XġB) \bZy!cXTV~!,+"PB}ٝnxgKCW8W`eͨ۵ڸrsȚd`[,X7 tQN*9T7hRcדOjxI1߽~Oۓn˧Q'EPl{W"R ռ;ؖRcQ|oڑWD+t&*d5DDn#)$&?hbXV軶KYΔҔ\3džLjYqy 0b1[2` qK;yp* .1^ "{ sEׅD*|Fׁ^1V/v`a'ú"n[\ J :_Zh~>O~uI*OC¿  EPy(ִ qC|Ш~jPݑs"a~ lh0Ir pCvw!(/Zoտ,;(,{fa3];OKPFA%V8jX@HN<і;K/3  #:y#cd2j&U+mmkuT-'UD3ٖVDm9 ZyV]bTm Zh9g- &ژpj(3;AuS+1(Fˈ+PӘP:tPX p@؂)ÇLę{rs%ˋჵPdJ7vAuVH1޲RC|^Ȫ#Ve=f1 -*o1tKKZU4yOm>C=-74@0,h櫿EGȩ`xC-=Q!YU#۽fQT 'rR4H\0Z L=Lnc)R[{|ֱg@([a6 2JH7"1<0l \]e-0D-TK}G\6%\$k׽ޛ+saS|̐5jG0Lo=ue"s&l?ȎCx,$^ ׉cn։n?VG6~Y%לW]iҽ }i\E'~_w;^*vk0 |u> Fr6{qmo4O2ώ~?ã$祓 )m^{[Sw~:I G+?j X%߹t0`eM?$?zWmXv~Z`wTw{[(]Ƚ:ɫr+ɇó䗯HiGn|ׇgy:~aw?O=_A_.|mRFx.l~ۿz2L6` ,#}~? &_jL1^}o^oe684߿,hp=L„4_~42 WChw3( P$[˲糳eMT )O-XJ`hӄ{K 2ߴvzb]e@2{+N4c2|}p ;;UhðA<9SsuEW/G(Cd1XQ5 {;2e7+صvHֽ>N /t' Ti4JlwMց58vw[8qnԢJVݺIRڹ;DH;D "Y\jKOZ6EL!e4 ""]q(%"xt@Ўt8(dLrEoY2@˹3K`<]mt8NTNtA3`ۓi]&4DϾ?&ǯ@O͸hdκWNFDntgGtGIF鮲7޹:@1cPHΰ԰'\B3 2܃܇`.VA-Ӯ m'ޝNѻS 8{.F, )˻|!Qr6 a)'2 1ǑքY0QFD\A, #)C  /PJAKn (G85CRs.)M@adDf+p6E 8gZc "דuw?M3#_N @%I*HBK;U k|uH&E+׌ll@}i%jnil,b" Z.&s&-A5uն0jcuh. Q|'LN"1@BCzܬ:yh]+4ggG+06^f>G?K%(/[گDƬkMmX6f\υvg:$T1QuDZ̀n"$"lB-ox\\y'%\nQ_01VQ:ٌUJJ{}brGPp0sK[]a%)(4옗M]M7G1M6/q)EMYZ[mJ&I:Z?n5W5q+PYB&n]e"xC [{ zڶq_6SϷ4)G5%d7E<.9J&xi2'bR\99B\hkR,Õ E7%{ΛgqMk^.F?,5G]Fbj6w_̝|.ʪZnh?U׻-ksa`AzKZR[N'zpj8R݅!v_'EHܻ.I2>519ͺd /؉`njy~6=C¸y9lr=+&{ЀƼ@Xb& aɄbB[h䫛BLX 0!ơlczj$^X`*Wjf{:="bcӑ@\1qn.Rk3.1 r"izc ɹdӳ5{9a}sb~*s(g:hٽ?&{&S}ex5i6-~aN'?(Y&oO. 
*cٲ8=Ky}x:_)ۧO{F*'\!Jm[TA>;ʌquFV%Eg +]Z<2(XʒRcISܥ:TM ]98&KR_IAF~7h8U ?9cڂF2~vl96Mu.lVpJV\Is½+D`,9Jvs8x<]vМO19` +(-"Dɀ|eѥN ӂ.Qe\|) 9F`bbLeNYwb4 1chg$̈́yAѪSxxe݁13BcqIp`\=]d<4?Wx'x4 TaDlh B-sϥ С:qf4jѡ4CfH+ܻq C_ct;T t9k ѡաLa7#La2HN)\3 ϧ ,ՎnuuBDcqme\k^ $&Ts- څP$-0J)ʬ5Dֹ`@eV3Ңt+ԀP,-KRkwύ@EVJUҌ2YZ+3i]aLȵ\XKl `E"7@.TFm=}}1`7'k!s/V=Y'"]]?~GrP |~W߯?gqsrq{,:w{7Z9g>{/V9ȩ"{2ŌR(6/~; DEnRM bٻ|"t<8{Mi pSo~JMM؋1dJxZS7NRkB[3QM1Wd̕RI,%&OfԟjX}:Gs&T" %%V9si0,PfZ,PZs3dZj*SpqT2\%)LƤ4Li\3).\TΊT hJ @ ÓX&ݹrKDV();wV@Sfݶ{8=W'kɟS n#1x)8DNJ|M5]Iz1PkFa*c ?ש]7FBsx %L'YAŦYjUΒw9[Fm2A9e( <]tAt a3@JDIw_翄{s5 $^-]w w>)c#=Ty Nl> #WFPX14 S=JBjEN/`/m*led߽1Z߅}c2v.>sy)Gc?F)P&ޗEo5uy-|EsZqrE#тqtaeFw#ryb[|\oqa{U78:]8Q|wҥK[P <] Tϑ{1˫<شoa&QgU j^&ܧ>)~-{GEZ~6di!9PZo$oi7h;9=ĴBpLE$Kdx7v}I5htqO| Ll< [MUcEpJ1,z «=vK/g 9f^sV)?*En+/Z7!^y!NQsWy.GcJj@E_>T|jy^,i681f jT1/ =]gm:b@GZBL7 Bkw)K6%1;zFrܣ&1Ն" oyT2qn6hC7D "c:l8:3%'q]KR_ImjuFs.N͆m;)0F%O{mF!LmzTO:sSFl%`+A ۝R#J\᧋gG'"825P#HXe@6kTțR;ӗyeq ØkB`w.}Ɯx;W L9 5\aE+)B<*a[ :%K[bgƪW8*1Osp7x AMssV)Q@[CU7U2!\%$J*W%Z"M6gh@L_>P(j:q^Sq'R%v98Nq"1pxL6G@ҭl^@|C-*ZJ/PŹiߨl\^}TǠ-&^ $^lvB6lixƠvsbRHuU'KH=#@HVb}i1ݦMO&MfE+g&ox_7ߺm~L2n?I^kq{9}/ucq0kQ)2gy"ˤ̅ENUi$isͬ.Rv_0Ϗ<ؿ?,CO2cd}HO g~^<9i}ˀ{.,nK ڣ_{~=u^"}t}}^,|1. nܢ j :vi8[)XE?S)K&4U t*8 ASԥRXPহEIv^ 48oPN߮ӿ~6 Q^LR!Dc9 U)A2".MҩMeYI,pEI^VЛz]Ys#7+ >#4H\vL8v -u5)dz}E$UXf[m'SMY`A*E\YmIe͸ iO09Zt&!q߅zV(0KIUtt{V2gwS~}iX)=j-çOZғ>\~|TNJcߏyTtwToW%t'C|0= %9+uLH^ґ SiAa\ݟ** ,x'J3C@Nn5UV7!hhX_wGJnƃ>vP+2V݆AN8LǏ%:E)HFB1:F\ٓrD|r nC_Ř:tt $LSD_0,GOt_a(P΍+:yD"2&HV&Ȍ,YH.6'HrftSQ.\@ ETF20^y B#H4 IAn%Az؉Yh8KTUF&'ge6!y7dzGUP%'Txߑ:A!cGR'r:S!@ ]ʕ-!FR6zR;k/ZWE*d8\/qeЀsnMauj1p{хUm(f%ZSg%T"0KniF=2a#@M1n~w+1^1fr0&"5xw"/6E޽  Ub+1UypmZиeڠAPJt jPg#z=d2zQDψGÒ0(go%jZ=d=ZI!ٮUSs. R9%tEIt4M.^\Gœ~0#puf"T=yŚUX 2j;߾ܾoezh ߸aڶ_eb(] Kt|%+6äkJJZ77T`Nq4Fnظ,6Zć9yk`6QxJ5BF0Il]F6mS{PZT4ng޵ bC&kgԤj{zãiܣslEܳSMle16x3z)^R=9:;/dzTB-e8|E[ eh!,M ='xyV6Y9<[<\'$ŒJ' 0i<JYELC21ː1MZYP^dּPu'sՋF4,,؄+.$&A.O69o>I0"|AuJ. 8o b0FLeJ{w+c^ThG`d U ;XX)>D 2)ʠeIcftĜNK,  @؊u\j׳|7^fғw9dǘo$gώIqa7S6+Bݰ '2@Q l`KR'(&2$y4 L /F=c!$ZfQ)k,(C N`6L@gUsE܅ԡy-30( )֘~Q- )kna8 5Y(&B BWRd3Ԩ-lsK2i ī㑓aBKhgukǔ%jDQ ]f!mcR@G#YG_n)譸S#2YJrZqg%e!uh6ܩH^sa$eȜX5*zu{BC3N"M[>g_'wyw{qѰ%"kkw3MDi>OՉP!'\.U\\SLuI&=0 ~|z7ռI+OOKt$UVE2VϤSi>|{}-O<Eq,L1:Im,Z]2Ʌ`;=c%b Ӌw~ {dl ֗W^*ruA% AA1{y4_բOc]ғzTv&6PD0l fr}մa" ׸YmVy&/\moN #T/ӛ-1y}F _Fy>\۩&7FsKvaf}i}&/rz~@#{BB_/O~N8SVk sDߒ>kgJh0?ߦ0fQe=Pa4m39fxyz5d>_.=}fLgBnAo7bNybZ?=8x 'Qa良6o'C|mpFJ?6G>ϟ=u]-۫:{+%7kR=J+}8%쓆&yMt{ӓވ#(!T~E*r7/7>_wΪ յ&5m`;.rUF'ge6(4XP%Cd҄d D$Qu &2]SW2-ЅZ*]0df`.A-7MҙO: fRzR|%X ('赑#ؘH( heufֲfYw5[{+h(!Ъ‚XVx.z>x9ܕbG>@a sW:wf@n)]K;2ԖݹNB p @.s ,$!2kysZIX,EGFpi'!! zzH ݟ;K,ps)}T:Ռߟc>&ͪ}fw\v9ifgثOeޗόRVvyMh9 HNw Pv!pݚKPսxX;ҖêHL]WnE]ζJyϺ r֝YyY[I]4XSD,u1٢0/]6:imδB,Kd;KI}v@B[|),*]+'?+ɽq+R{Yi7kyZ-Z>pׅgXR|~?WǷ͛y7laՋx]ʼ% &|K]fA >XZMC<1.(v%,uxc݅uE3iuh6q[|A SЉLQHilA(()tNgt1'36vRpg7x`"qxsD ~c B!+^n`+px+>mX|Hp?πء! Xʷo@Z2ѫܘ -G ,)[G?(ܮ<:=+lvF7Ongjfz{$7 T&Qս|&?_n닒UcWLP ;oҰc8ۃA=n;! 
(W>{5e =K E\ up.f=XKP;k)A:5)I i .86C>M>9ClPAB*P&ƪlL}2ĞTgEBO'd]Qf{/gŔIL SYa9`8R)mTg.`P%t &LJM؜KZj9' X`Xmx0 #K`Ȩ;VmaױYKt}}p&h{nzj M&R0X-ah`ؾ <$nK9:A?8;ˠ@0?̵-\~ @ʅ~c@ ̂HYbXrTv"}5s Bm>c $5aS%b+F :\$Z (ΙLxƩ`c_[]xjg㏇4YN 7Zj0)%)ܗAMf24}j} 5A%<˕BM ShR#IV |= ǬISwm1p%kJ3`s0,_!L)q+ٸHACmY&&º\UAo $5bX uVʏC$& l \5 &MղCl`DrĈ L(v.@.bJ(%wWY4Q9)외¹5``4BTC8i̡"x+RY=펧1'aKN!FNETЧ\KgZuD.k҉/Ɵp*D#FR 1#mos0O#u"ǀ+ٛD+M@"h;W^k؉`G!lS8C{oH63ć zllMq~\ 0 X{Ε^3 !$q~*y>Dzf_Xaު[j~*҄wJrі||;Sc$ښD ZwO.0me?5T` lFgooi>wCӃo܊NL 1\ӿ:TOg5ΔExVUK}ў0WጢwrzP97Z_ox:^%*KhrqJSY#@eer&עTt8R;[JƗ}k9(3;0xzf:QMvQvl^ NjB;Zz};hDtGiU!=lM M4kݶѵbQAvm+[1fwV~/?k| [?j<-pg #^?[58t^|Z*dǙZe_5SvsmJ^R⢗dϊlPT7ÎY w$vnàzڕAnX<$'^x) "?CQ6'C6;YufEJZ=3(w^>Y<`hsÜս= g{c<ھ}V܏]b~yZa*^zו`yݱQ?ڻ3vk3'}E{! YPl 5rfqh:/w/ץwL(Ykܤ(4kоqf(h'<8ѳvV,eG0hHn 7/9p ~u!9rIk|;_~mctT>y7>s &B JEl V5S.3 \'G>KMx]zMbиhU "G kzgݫRu3°&Y:fUQnG#TI3QR18Tۺ^\*_kҬs,{u{&[D*qZW~#[ /T3Y^]yvz2.'ZX9݄~gm-~=H`{=KllwDM*/̰|T:A~+t-Nj"Ft|~|t|V nZ#jwmu{R1rXwQ2k`iFkӻ.5]{Q*^6_TTgԙlv>:WC/ ~{ehT0Zs} e{?ƙLjo= ;eو;NdmL ܦ[ ja24i64V4}7s{x=8΁?w{oF\ x虖i̧H^{=mƙ,zD8VD'SRogx.@NT5?{c- _HhNx۹ގ~*2=ku=ֆ ('KߞY s)5yZ=@&u7f0۰;w!Mb{E;4Ózt}OFCeODŽ%ގް7,Z1o kFY`V/C;כbx ^8쟭Jt}+j| ,G㵁4)2zy&DQ8AiOSǻl2nzY>G 0mOdz? OrX{\Sv6ֱe旇dq|I}i "D4㷎kX`[c/ ڪu_ l+YGG&0}70:FxuGo+!XcMS'h;Yn)ρ(Ö}h+[ގ<;76_OQ!l]őޓ6$+^ {dGzwˀ]`gf_ "Of$QMRswG)DRTH5DjWƑaYb8/fߝe- Evm[w\;̀_e ^|q^O0$ Wt(Q ޕrFT%`f7<&w辌?#`@H#$hbKED_izy!|JORRӧj4Qٞ7; _ صBNꃻ8`gk'hPԨ!J)WrIm:Q_U$J%F *:2Ew2)5QyƔMg\ Mjl%*%"F x#t(H ~rD(s ӱ6C>J!H^du;" d~n⼢=Vi( &Z-##TM;+9xgղfr{ϵ!zkh^Iz/V?h>[ݩ%9߽Bo>4g%ɹڇoҏi cBn/EEeۢhҢ9:69cqGmg^s{2D$}ߊn{_9WWa95co `^g/ɛ9 O')%}~l}dJm׮5k`8#JKgrqpOkf;/d"&Ӂ;wJF[e';%p; qѝ_ziN#T{iϹNI%ewzugs|cvoL@w~vmLB!JJw`C^ [S- /# vIӵ47Qd5`60I%\z&?zGf ϱDv"2Thd gB''uKE]$- bJ` :\!m2 %+Xp׳_X-GvJtld9 pworjNs\JF})U`e@9s8/I#%4!._ֈQPˍ/ǝI86;b䔫~>^/8Bb'une,E H XBO /P:`Sv I壩Cq0 Jߩ4$YD;H! EN 2b=|-6T#+Wz_WN/ )ЃrcxBwnPW^. ?r6Mk?'nQ/檆ui6V68@n|s2K@?~qvf뫻9 r@\̚X2w(o *k718#53V{p:੉P:i!VK&)Jjj޷!ϓW\]wE~7?\YW(k#NS5mwټIJ5+,/܅PAQ퐎dsENFz_%cjx{{/` 9M1 Xogp=)cE(jb̩آ(CZN'H#,K[[:v˛نR9Q5?hOzJӎiSc>D+h} Mk0wlnDx YRojl$ \ę1j$uMJq'ȅ[nFmLKMY^c޺ؚ׆𪣤ʘ#T^hO0![ǩW%IbG$ !c"Ā1&  4i4!:@)2 󔇑JbA DZCJ^JZ Bº4UڳRkJ 3fWu.F6 w"3!):05p魱b 2)]u'#m(/k+'~tc!f;[+r98zGߖ.ǥ.@`ח竃r^sq_p}+lnT"LP ΕZ?td:[0uF3wn/B=P gzkj|aPHY8?/(;X+~= M%YY (W;_-^UXk7g5f$u1;3ŽN_ؙ x^ewTw6A8ۂg, Q`q' XQw)ĠUn0A+^i-T=a;98Qs ~yj0y{3O}z>9ҨT='kwTXe^W3W\_Ye#'ha8l>nMu(fd-Z{9&RJJfV(옹t &3`=$oی9LdVfɷW??h{\fm~75<~͸;Ѳ.2z\\F'#ZqTdݳ^ ޹ w'[չjܵ)q4孉7hDaiH^H>8}\Zs}. \\ZI44аꀦ4$Llh6PZil`s-65 }C62vTn]=Ԍ0>۷80d\&̛_ՒRɀtN>haGMge~u$B%/?Faȗ8%`+']+I$O.mdJR{Kڭ% D;iy_BPZWKj$O.dJ{ȫM^nmiP":mD0Bi\ ['ѭe)nj"͏2Ǜh<ܟ'_;jXUy톋W;f0\{ yRXGȀkTWU /qo_o'_'sa\V;hUZ-_=%&}ƧV²]=%.fsE+}Hl*[(%grgP.Q.}֤zAm#gs} 0?KctB{arF"PULY{0Fvذ[c ZN p=_TyNh!fV$AZb4!z;PWW8-Y}']nG3U&Lx,!tԗP+V;8I1J(<9Cm!T}TPQk{R]8 ?VRe58o')=n)R*aY;tj1A~]{]UO7'9 !PBE/u!F ?Ad1Vdg-}'s^hn4w'}䚻6?7U4@RQ?HK;dno=a+@bC2؉rNkD4&FʱT/Crk,GrB ?UuH )q^}##ĵH&@%Q_Y-Wv}F''B]]q:11!`"4d( :kVZ4eHS]) = e5ӈÌI)Ca[g驭h٧/a-Is /?62I {sRafBŀp& 76^OG%^< JLF!(Y`@I"g4p#HaڭI!W$\9HQijc :BI/qx :ʠ?GfG(1/ծN wCYFDdV*I>$*^{@̑}kA7' D:քtAݱOL}YXD `%(#oQpI3gh $ЗFPe8cB%5 N@ RHuC8r C2KxxiR#q1`1\pP H  =oۮkP~˭Wլ0ȲR֋IaI34ש|RG-OSw,PꭉJ/ Lt]!5zuZJJ+ͫͬ6rH}f?:{jRJײZ.9z=^NS=$~kAgdaԯhI/))/''J]>oS BM,kN/SVZSnh C\Ypρ@,=ZQ!{n= `NWD/8}Dh8]eRct;eqWʯC94o5^cύQf$M.w^FS4'b-0T o M?(ZoKi?K! 
var/home/core/zuul-output/logs/kubelet.log
Jan 27 00:07:00 crc systemd[1]: Starting Kubernetes Kubelet... Jan 27 00:07:00 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc 
restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc 
restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 27 00:07:00 
crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 
00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc 
restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc 
restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 
Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 
crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 
00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 
00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 
00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 00:07:01 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 27 00:07:02 crc kubenswrapper[4774]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.103461 4774 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108783 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108815 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108825 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108836 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108845 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108855 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108869 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108904 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108913 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108922 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108930 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108938 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108946 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108954 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108961 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108969 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108977 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.108985 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 
00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109024 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109035 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109045 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109053 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109062 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109071 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109080 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109089 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109096 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109106 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109114 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109124 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109136 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109147 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109157 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109165 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109173 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109181 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109189 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109197 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109206 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109214 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109225 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109236 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109245 4774 feature_gate.go:330] unrecognized feature gate: Example Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109253 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109261 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109269 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109277 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109284 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109292 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109299 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109307 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109316 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109325 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109332 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109340 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109347 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109355 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109365 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109373 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109380 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109387 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109395 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109403 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109411 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109418 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109426 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109435 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109443 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109451 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109459 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.109466 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109607 4774 flags.go:64] FLAG: --address="0.0.0.0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109623 4774 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109638 4774 flags.go:64] FLAG: --anonymous-auth="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109650 4774 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109662 4774 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109672 4774 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109685 4774 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109696 4774 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109706 4774 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109715 4774 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109725 4774 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109735 4774 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109745 4774 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109754 4774 flags.go:64] FLAG: --cgroup-root="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109762 4774 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109771 4774 flags.go:64] FLAG: --client-ca-file="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109780 4774 flags.go:64] FLAG: --cloud-config="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109789 4774 flags.go:64] FLAG: --cloud-provider="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109798 4774 flags.go:64] FLAG: --cluster-dns="[]" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109808 4774 flags.go:64] FLAG: --cluster-domain="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109817 4774 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109826 4774 flags.go:64] FLAG: --config-dir="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109835 4774 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109845 4774 flags.go:64] FLAG: --container-log-max-files="5" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109862 4774 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 27 
00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109871 4774 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109912 4774 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109922 4774 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109932 4774 flags.go:64] FLAG: --contention-profiling="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109964 4774 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109974 4774 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109985 4774 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.109994 4774 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110005 4774 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110014 4774 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110023 4774 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110032 4774 flags.go:64] FLAG: --enable-load-reader="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110041 4774 flags.go:64] FLAG: --enable-server="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110050 4774 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110062 4774 flags.go:64] FLAG: --event-burst="100" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110071 4774 flags.go:64] FLAG: --event-qps="50" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110080 4774 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110090 4774 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110099 4774 flags.go:64] FLAG: --eviction-hard="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110110 4774 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110119 4774 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110128 4774 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110137 4774 flags.go:64] FLAG: --eviction-soft="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110146 4774 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110155 4774 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110164 4774 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110173 4774 flags.go:64] FLAG: --experimental-mounter-path="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110182 4774 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110191 4774 flags.go:64] FLAG: --fail-swap-on="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110200 4774 flags.go:64] FLAG: 
--feature-gates="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110210 4774 flags.go:64] FLAG: --file-check-frequency="20s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110220 4774 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110229 4774 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110239 4774 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110249 4774 flags.go:64] FLAG: --healthz-port="10248" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110258 4774 flags.go:64] FLAG: --help="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110268 4774 flags.go:64] FLAG: --hostname-override="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110277 4774 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110286 4774 flags.go:64] FLAG: --http-check-frequency="20s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110295 4774 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110304 4774 flags.go:64] FLAG: --image-credential-provider-config="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110313 4774 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110323 4774 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110332 4774 flags.go:64] FLAG: --image-service-endpoint="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110340 4774 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110351 4774 flags.go:64] FLAG: --kube-api-burst="100" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110360 4774 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110370 4774 flags.go:64] FLAG: --kube-api-qps="50" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110378 4774 flags.go:64] FLAG: --kube-reserved="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110387 4774 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110396 4774 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110405 4774 flags.go:64] FLAG: --kubelet-cgroups="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110414 4774 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110422 4774 flags.go:64] FLAG: --lock-file="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110431 4774 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110440 4774 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110449 4774 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110462 4774 flags.go:64] FLAG: --log-json-split-stream="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110471 4774 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110480 4774 flags.go:64] FLAG: --log-text-split-stream="false" Jan 27 
00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110489 4774 flags.go:64] FLAG: --logging-format="text" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110498 4774 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110508 4774 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110516 4774 flags.go:64] FLAG: --manifest-url="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110525 4774 flags.go:64] FLAG: --manifest-url-header="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110536 4774 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110546 4774 flags.go:64] FLAG: --max-open-files="1000000" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110557 4774 flags.go:64] FLAG: --max-pods="110" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110566 4774 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110575 4774 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110584 4774 flags.go:64] FLAG: --memory-manager-policy="None" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110593 4774 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110602 4774 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110612 4774 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110621 4774 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110641 4774 flags.go:64] FLAG: --node-status-max-images="50" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110650 4774 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110660 4774 flags.go:64] FLAG: --oom-score-adj="-999" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110670 4774 flags.go:64] FLAG: --pod-cidr="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110679 4774 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110693 4774 flags.go:64] FLAG: --pod-manifest-path="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110702 4774 flags.go:64] FLAG: --pod-max-pids="-1" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110712 4774 flags.go:64] FLAG: --pods-per-core="0" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110721 4774 flags.go:64] FLAG: --port="10250" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110730 4774 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110739 4774 flags.go:64] FLAG: --provider-id="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110749 4774 flags.go:64] FLAG: --qos-reserved="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110758 4774 flags.go:64] FLAG: --read-only-port="10255" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110767 4774 flags.go:64] FLAG: --register-node="true" Jan 27 00:07:02 crc 
kubenswrapper[4774]: I0127 00:07:02.110776 4774 flags.go:64] FLAG: --register-schedulable="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110787 4774 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110804 4774 flags.go:64] FLAG: --registry-burst="10" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110813 4774 flags.go:64] FLAG: --registry-qps="5" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110823 4774 flags.go:64] FLAG: --reserved-cpus="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110832 4774 flags.go:64] FLAG: --reserved-memory="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110843 4774 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110853 4774 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110868 4774 flags.go:64] FLAG: --rotate-certificates="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110901 4774 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110911 4774 flags.go:64] FLAG: --runonce="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110920 4774 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110930 4774 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110939 4774 flags.go:64] FLAG: --seccomp-default="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110949 4774 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110958 4774 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110967 4774 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110976 4774 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110986 4774 flags.go:64] FLAG: --storage-driver-password="root" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.110994 4774 flags.go:64] FLAG: --storage-driver-secure="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111003 4774 flags.go:64] FLAG: --storage-driver-table="stats" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111012 4774 flags.go:64] FLAG: --storage-driver-user="root" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111021 4774 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111030 4774 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111039 4774 flags.go:64] FLAG: --system-cgroups="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111049 4774 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111062 4774 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111071 4774 flags.go:64] FLAG: --tls-cert-file="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111080 4774 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111090 4774 flags.go:64] FLAG: --tls-min-version="" Jan 27 00:07:02 
crc kubenswrapper[4774]: I0127 00:07:02.111099 4774 flags.go:64] FLAG: --tls-private-key-file="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111108 4774 flags.go:64] FLAG: --topology-manager-policy="none" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111117 4774 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111127 4774 flags.go:64] FLAG: --topology-manager-scope="container" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111135 4774 flags.go:64] FLAG: --v="2" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111147 4774 flags.go:64] FLAG: --version="false" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111158 4774 flags.go:64] FLAG: --vmodule="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111169 4774 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.111178 4774 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111413 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111426 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111435 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111443 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111453 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111462 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111472 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111480 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111488 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111496 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111505 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111513 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111521 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111529 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111537 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111547 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111557 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111565 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111573 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111581 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111589 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111597 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111605 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111617 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111627 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111636 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111646 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111656 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111664 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111673 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111682 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111690 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111698 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111708 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111716 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111724 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111732 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111740 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111748 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111756 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111764 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 
00:07:02.111772 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111781 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111789 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111796 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111804 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111812 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111820 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111830 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111840 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111847 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111862 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111870 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111900 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111908 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111916 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111924 4774 feature_gate.go:330] unrecognized feature gate: Example Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111932 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111942 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111953 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111963 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111972 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111981 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111989 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.111998 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112006 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112013 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112021 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112029 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112036 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.112044 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.112057 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.121143 4774 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.121180 4774 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122180 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122450 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122466 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122484 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122497 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122509 4774 feature_gate.go:330] unrecognized feature gate: Example Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122521 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122533 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122546 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122557 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122787 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122803 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122814 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122824 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122836 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122849 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.122897 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123152 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123166 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123175 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123186 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123197 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123206 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123215 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123658 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123671 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123680 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123693 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123708 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123717 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123726 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123734 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123743 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123751 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123759 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123770 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123778 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123787 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123795 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123803 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123814 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123824 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123833 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123843 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123854 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123868 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123900 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123910 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123918 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123926 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123934 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123942 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123949 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123957 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123965 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123974 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123981 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.123991 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124002 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124011 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124020 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124028 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124038 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124046 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124055 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124063 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124071 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124080 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124088 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124096 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124105 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.124119 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124386 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124401 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124410 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124419 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124428 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124436 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124444 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124453 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 
00:07:02.124461 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124469 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124477 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124485 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124493 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124502 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124510 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124519 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124527 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124535 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124547 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124560 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124571 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124580 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124589 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124601 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124611 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124623 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124634 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124646 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124657 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124668 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124679 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124690 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124701 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124712 4774 feature_gate.go:330] unrecognized 
feature gate: BootcNodeManagement Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124722 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124732 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124742 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124751 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124759 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124767 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124775 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124782 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124790 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124797 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124805 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124813 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124821 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124829 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124837 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124844 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124852 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124867 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124874 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124906 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124914 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124924 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124934 4774 feature_gate.go:330] unrecognized feature gate: Example Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124942 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124951 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124960 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124970 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124979 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124988 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.124996 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125004 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125014 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125024 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125033 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125041 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125049 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.125057 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.125069 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.125965 4774 server.go:940] "Client rotation is on, will bootstrap in background" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.135150 4774 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.135592 4774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.137617 4774 server.go:997] "Starting client certificate rotation" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.137675 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.138889 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-13 07:12:00.938935368 +0000 UTC Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.139039 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.162845 4774 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.165701 4774 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.166092 4774 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.183437 4774 log.go:25] "Validated CRI v1 runtime API" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.224720 4774 log.go:25] "Validated CRI v1 image API" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.229448 4774 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.234247 4774 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-27-00-01-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.234321 4774 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.270207 4774 manager.go:217] Machine: {Timestamp:2026-01-27 00:07:02.265728101 +0000 UTC m=+0.571505055 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e18b2370-db20-4c66-88f9-fff4652ef035 BootID:798433ee-0aed-45e3-8b2f-39b7bf5cbb06 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 
Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:af:4d:f9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:af:4d:f9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:07:29:13 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b4:fd:3e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7b:50:df Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ea:e7:8d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:34:06:c0:b8:79 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:7a:be:f5:90:45 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.270606 4774 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.270818 4774 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.273237 4774 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.273647 4774 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.273714 4774 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.274221 4774 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.274242 4774 
container_manager_linux.go:303] "Creating device plugin manager" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.274771 4774 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.274827 4774 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.275158 4774 state_mem.go:36] "Initialized new in-memory state store" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.276071 4774 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280502 4774 kubelet.go:418] "Attempting to sync node with API server" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280550 4774 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280606 4774 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280634 4774 kubelet.go:324] "Adding apiserver pod source" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.280657 4774 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.285024 4774 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.285100 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.285167 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.285490 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.285557 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.285951 4774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.288145 4774 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289919 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289942 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289951 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289960 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289975 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289985 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.289994 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290006 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290014 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290023 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290032 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.290038 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.292113 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.292458 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.292911 4774 server.go:1280] "Started kubelet" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294086 4774 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294121 4774 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294681 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294705 4774 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294715 4774 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294742 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 08:53:47.390547811 +0000 UTC Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294841 4774 volume_manager.go:287] "The 
desired_state_of_world populator starts" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.294866 4774 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.295022 4774 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 27 00:07:02 crc systemd[1]: Started Kubernetes Kubelet. Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.294982 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.295824 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.295968 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297176 4774 factory.go:55] Registering systemd factory Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297222 4774 factory.go:221] Registration of the systemd container factory successfully Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297539 4774 factory.go:153] Registering CRI-O factory Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297581 4774 factory.go:221] Registration of the crio container factory successfully Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297670 4774 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297697 4774 factory.go:103] Registering Raw factory Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.297714 4774 manager.go:1196] Started watching for new ooms in manager Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.297674 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="200ms" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.306704 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e6dcb029d1889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:07:02.292822153 +0000 UTC m=+0.598599077,LastTimestamp:2026-01-27 00:07:02.292822153 +0000 UTC m=+0.598599077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 
00:07:02.312490 4774 manager.go:319] Starting recovery of all containers Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.312647 4774 server.go:460] "Adding debug handlers to kubelet server" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321348 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321412 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321486 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321518 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321533 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321548 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321563 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321581 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321600 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321616 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321633 4774 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321651 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321667 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321685 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321701 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.321719 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322102 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322121 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322136 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322151 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322169 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322186 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322201 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322257 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322287 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322336 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322355 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322372 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322387 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322402 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322418 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322436 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322452 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322468 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322482 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322499 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322537 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322553 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322569 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322585 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322601 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322618 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322635 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322652 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322672 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322690 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322707 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322729 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322746 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322763 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322783 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322803 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322826 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322846 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.322990 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323019 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323039 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323058 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323075 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323092 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323109 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323127 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323178 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323196 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323211 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323227 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323275 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323292 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323309 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323324 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323370 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323416 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323800 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323828 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323847 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323888 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323908 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.323926 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.324937 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325049 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325084 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325114 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325145 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325172 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325202 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325240 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325272 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325302 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325333 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325360 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325387 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325416 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325445 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325473 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325501 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325533 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325560 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325590 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325617 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325647 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325678 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325708 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325734 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325763 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325812 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325847 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325932 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325968 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.325998 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326028 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326059 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326093 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326124 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326152 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326182 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326211 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326239 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326270 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326346 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326376 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326408 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326437 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326463 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326492 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326518 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326544 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326572 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326606 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326635 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326660 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326691 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326718 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326749 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326778 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326808 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326841 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326903 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326935 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.326971 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327014 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327042 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327068 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327097 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327131 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327158 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327184 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327212 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327238 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327267 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327294 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327321 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327349 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327376 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327402 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327428 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327457 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327487 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327565 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327598 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327630 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327654 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327683 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327709 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327734 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327773 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327798 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327826 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327851 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327914 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327944 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.327973 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328000 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328024 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328049 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328076 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328110 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328134 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328164 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328191 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328222 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328249 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328271 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328295 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328318 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328344 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328369 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328395 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328418 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328445 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328472 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328500 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328530 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328554 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328583 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328610 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328634 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328664 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328691 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328716 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328745 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328775 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328803 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328830 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.328854 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331153 4774 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331209 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331244 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331277 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331302 4774 reconstruct.go:97] "Volume reconstruction 
finished" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.331318 4774 reconciler.go:26] "Reconciler: start to sync state" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.334361 4774 manager.go:324] Recovery completed Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.347790 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.352285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.352333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.352358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.353278 4774 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.353736 4774 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.353774 4774 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.353809 4774 state_mem.go:36] "Initialized new in-memory state store" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.355259 4774 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.355314 4774 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.355352 4774 kubelet.go:2335] "Starting kubelet main sync loop" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.355408 4774 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.359104 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.359192 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.370071 4774 policy_none.go:49] "None policy: Start" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.371560 4774 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.371586 4774 state_mem.go:35] "Initializing new in-memory state store" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.395379 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417109 4774 manager.go:334] "Starting Device Plugin manager" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417155 4774 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417167 4774 server.go:79] "Starting device plugin registration server" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417553 4774 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.417569 4774 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.419385 4774 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.419512 4774 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.419521 4774 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.427434 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.456224 4774 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.456326 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457725 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457852 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.457904 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458806 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458954 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.458994 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459558 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459683 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459727 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.459988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460582 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460726 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.460783 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461596 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461619 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.461817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.462314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.462352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.462363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.508242 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="400ms" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.518711 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.519684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.519732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.519744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.519776 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.520244 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534198 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534281 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534320 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534390 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534430 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534461 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534507 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534531 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534551 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534571 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534593 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534614 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534697 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.534730 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636133 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636176 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636242 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636266 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636285 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636313 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636332 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636358 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636401 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636486 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636561 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636573 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 
00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636586 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636601 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636432 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636619 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636646 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636695 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636671 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636717 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636756 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636756 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636773 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.636971 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.720771 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.722016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.722048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.722056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.722076 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.722505 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.784379 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.789905 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.796717 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.820462 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: I0127 00:07:02.827319 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.842112 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-344b8c1ba8721614715bb49ec8fdf2492938ad911ba5b32de4fcab529b1a5477 WatchSource:0}: Error finding container 344b8c1ba8721614715bb49ec8fdf2492938ad911ba5b32de4fcab529b1a5477: Status 404 returned error can't find the container with id 344b8c1ba8721614715bb49ec8fdf2492938ad911ba5b32de4fcab529b1a5477 Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.844514 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-13b6370963b2822656a64f9f14e054da3bebc442e405ee4128d931f924da78a2 WatchSource:0}: Error finding container 13b6370963b2822656a64f9f14e054da3bebc442e405ee4128d931f924da78a2: Status 404 returned error can't find the container with id 13b6370963b2822656a64f9f14e054da3bebc442e405ee4128d931f924da78a2 Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.849131 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-9584501c7473a433e1a66a1dccc481a2441c33db7652822a4bda4a47fec5031c WatchSource:0}: Error finding container 9584501c7473a433e1a66a1dccc481a2441c33db7652822a4bda4a47fec5031c: Status 404 returned error can't find the container with id 9584501c7473a433e1a66a1dccc481a2441c33db7652822a4bda4a47fec5031c Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.856699 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-24ee2ebed2f311488e856ce85d820e679b7f17dc7ddb525aabb3210abda7f946 WatchSource:0}: Error finding container 24ee2ebed2f311488e856ce85d820e679b7f17dc7ddb525aabb3210abda7f946: Status 404 returned error can't find the container with id 24ee2ebed2f311488e856ce85d820e679b7f17dc7ddb525aabb3210abda7f946 Jan 27 00:07:02 crc kubenswrapper[4774]: W0127 00:07:02.862464 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-315500a735b1da0e9ee70be9654654f384727f76c19428e6340f388ed76f6fd8 WatchSource:0}: Error finding container 315500a735b1da0e9ee70be9654654f384727f76c19428e6340f388ed76f6fd8: Status 404 returned error can't find the container with id 315500a735b1da0e9ee70be9654654f384727f76c19428e6340f388ed76f6fd8 Jan 27 00:07:02 crc kubenswrapper[4774]: E0127 00:07:02.909332 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="800ms" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.123555 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.126677 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.126722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.126732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.126759 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.127231 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.293245 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.295281 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 13:37:37.237960573 +0000 UTC Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.359529 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"24ee2ebed2f311488e856ce85d820e679b7f17dc7ddb525aabb3210abda7f946"} Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.360414 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9584501c7473a433e1a66a1dccc481a2441c33db7652822a4bda4a47fec5031c"} Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.361947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"344b8c1ba8721614715bb49ec8fdf2492938ad911ba5b32de4fcab529b1a5477"} Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.362957 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"13b6370963b2822656a64f9f14e054da3bebc442e405ee4128d931f924da78a2"} Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.365093 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"315500a735b1da0e9ee70be9654654f384727f76c19428e6340f388ed76f6fd8"} Jan 27 00:07:03 crc kubenswrapper[4774]: W0127 00:07:03.366730 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.366803 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:03 crc kubenswrapper[4774]: W0127 00:07:03.474309 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.474404 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.709929 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="1.6s" Jan 27 00:07:03 crc kubenswrapper[4774]: W0127 00:07:03.832421 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.832527 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.927343 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.928989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.929025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.929037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:03 crc kubenswrapper[4774]: I0127 00:07:03.929065 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.929607 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:03 crc kubenswrapper[4774]: W0127 00:07:03.936045 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:03 crc kubenswrapper[4774]: E0127 00:07:03.936142 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:04 crc kubenswrapper[4774]: E0127 00:07:04.196598 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e6dcb029d1889 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:07:02.292822153 +0000 UTC m=+0.598599077,LastTimestamp:2026-01-27 00:07:02.292822153 +0000 UTC m=+0.598599077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.293765 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.295802 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:58:05.587295729 +0000 UTC Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.366412 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 00:07:04 crc kubenswrapper[4774]: E0127 00:07:04.367331 4774 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.369833 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283" exitCode=0 Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.370096 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.372825 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.376897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.376935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.376947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 
00:07:04.380370 4774 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1aaa7b231202686b65f633283644052e65287917c7621e708f036c1a7278863b" exitCode=0 Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.380455 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1aaa7b231202686b65f633283644052e65287917c7621e708f036c1a7278863b"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.380581 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.384190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.384214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.384224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.385698 4774 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537" exitCode=0 Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.385819 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.386412 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.386994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.387027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.387037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390447 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390490 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390506 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390520 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.390614 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.391973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.392038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.392064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.407677 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea" exitCode=0 Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.409302 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.409903 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea"} Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.413447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.413481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.413494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.415353 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.416297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.416368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:04 crc kubenswrapper[4774]: I0127 00:07:04.416388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.293760 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.296001 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 17:15:04.561834927 +0000 UTC Jan 27 00:07:05 crc kubenswrapper[4774]: E0127 00:07:05.310585 4774 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="3.2s" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.416056 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d0e4428feac8787d889cfa23715983ac112bac861e3e00ab7ae19ec676964eb4"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.416119 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.417419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.417449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.417460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.424899 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425054 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425143 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425166 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.425842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429035 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429080 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 
00:07:05.429101 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429119 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429137 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.429273 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.430210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.430248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.430265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.432767 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8" exitCode=0 Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.432814 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8"} Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.432879 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.433047 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.433747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.433779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.433790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.434526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.434548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.434564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.463065 
4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.537911 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.538993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.539030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.539042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:05 crc kubenswrapper[4774]: I0127 00:07:05.539069 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:05 crc kubenswrapper[4774]: E0127 00:07:05.539474 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.27:6443: connect: connection refused" node="crc" Jan 27 00:07:05 crc kubenswrapper[4774]: W0127 00:07:05.580368 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.27:6443: connect: connection refused Jan 27 00:07:05 crc kubenswrapper[4774]: E0127 00:07:05.580438 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.27:6443: connect: connection refused" logger="UnhandledError" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.296573 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 03:51:15.368636641 +0000 UTC Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440025 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d" exitCode=0 Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440257 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440334 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440404 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440440 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440475 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440340 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.440999 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d"} Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.441093 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.442972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:06 crc kubenswrapper[4774]: I0127 00:07:06.700650 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.297165 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 07:06:57.386613272 +0000 UTC Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445892 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445941 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445951 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445961 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445969 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd"} Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445989 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445999 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.445999 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.447440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:07 crc kubenswrapper[4774]: I0127 00:07:07.798336 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.254146 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.254400 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.256210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.256259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.256272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.294287 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.298351 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 06:56:27.091026964 +0000 UTC Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.410350 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.449098 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.449145 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.449195 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.450916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.450975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.450920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.451971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.452011 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.730659 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.740354 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.742126 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.742163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.742176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.742201 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 00:07:08 crc kubenswrapper[4774]: I0127 00:07:08.805749 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.298590 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 21:18:23.695541619 +0000 UTC Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.451155 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.451345 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:09 crc kubenswrapper[4774]: I0127 00:07:09.452583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:10 crc kubenswrapper[4774]: I0127 00:07:10.299367 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 11:42:04.254806817 +0000 UTC Jan 27 00:07:11 crc kubenswrapper[4774]: I0127 00:07:11.299827 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 00:13:25.59936718 +0000 UTC Jan 27 00:07:11 crc kubenswrapper[4774]: I0127 00:07:11.806429 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 00:07:11 crc kubenswrapper[4774]: I0127 00:07:11.806574 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.300532 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 00:41:09.599863086 +0000 UTC Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.339999 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.340243 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.341786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.341850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.341896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.359930 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 27 00:07:12 crc kubenswrapper[4774]: E0127 00:07:12.427590 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.461534 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.463281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.463348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:12 crc kubenswrapper[4774]: I0127 00:07:12.463372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.301104 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 09:15:22.562955207 +0000 UTC Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.381046 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.381376 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.383797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.384172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.384359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.392118 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.463654 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.465040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.465116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:13 crc kubenswrapper[4774]: I0127 00:07:13.465141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:14 crc kubenswrapper[4774]: I0127 00:07:14.301388 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 16:31:54.600609977 +0000 UTC Jan 27 00:07:15 crc kubenswrapper[4774]: I0127 00:07:15.301949 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 02:02:45.967839176 +0000 UTC Jan 27 00:07:15 crc kubenswrapper[4774]: W0127 00:07:15.839765 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 27 00:07:15 crc kubenswrapper[4774]: I0127 00:07:15.839938 4774 trace.go:236] Trace[238075526]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 00:07:05.838) (total time: 10001ms): Jan 27 00:07:15 crc kubenswrapper[4774]: Trace[238075526]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:15.839) Jan 27 00:07:15 crc kubenswrapper[4774]: Trace[238075526]: [10.001749827s] [10.001749827s] END Jan 27 00:07:15 crc kubenswrapper[4774]: E0127 00:07:15.839971 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.295535 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.302677 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:48:26.66934062 +0000 UTC Jan 27 00:07:16 crc kubenswrapper[4774]: W0127 
00:07:16.317043 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.317179 4774 trace.go:236] Trace[860708573]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 00:07:06.315) (total time: 10002ms): Jan 27 00:07:16 crc kubenswrapper[4774]: Trace[860708573]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:16.317) Jan 27 00:07:16 crc kubenswrapper[4774]: Trace[860708573]: [10.002120217s] [10.002120217s] END Jan 27 00:07:16 crc kubenswrapper[4774]: E0127 00:07:16.317296 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 27 00:07:16 crc kubenswrapper[4774]: W0127 00:07:16.411944 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.412363 4774 trace.go:236] Trace[728287144]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 00:07:06.410) (total time: 10002ms): Jan 27 00:07:16 crc kubenswrapper[4774]: Trace[728287144]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:16.411) Jan 27 00:07:16 crc kubenswrapper[4774]: Trace[728287144]: [10.002118508s] [10.002118508s] END Jan 27 00:07:16 crc kubenswrapper[4774]: E0127 00:07:16.412525 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.474554 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.476983 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7" exitCode=255 Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.477031 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7"} Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.477179 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.477975 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.478007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.478017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:16 crc kubenswrapper[4774]: I0127 00:07:16.478476 4774 scope.go:117] "RemoveContainer" containerID="5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.153807 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.153930 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.157641 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.157701 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.303180 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 04:04:54.521765215 +0000 UTC Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.482112 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.485332 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c"} Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.485603 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.487004 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.487061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:17 crc kubenswrapper[4774]: I0127 00:07:17.487081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.303903 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 07:42:40.401504064 +0000 UTC Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.740367 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.740574 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.740760 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.742111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.742155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.742170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:18 crc kubenswrapper[4774]: I0127 00:07:18.747285 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.304022 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 03:54:04.515640344 +0000 UTC Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.493133 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.494577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.494623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:19 crc kubenswrapper[4774]: I0127 00:07:19.494635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.304353 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:17:17.12335762 +0000 UTC Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.495504 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.496229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.496256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:07:20 crc kubenswrapper[4774]: I0127 00:07:20.496265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:21 crc kubenswrapper[4774]: I0127 00:07:21.304751 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 14:07:34.148355546 +0000 UTC Jan 27 00:07:21 crc kubenswrapper[4774]: I0127 00:07:21.806623 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 00:07:21 crc kubenswrapper[4774]: I0127 00:07:21.806710 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 27 00:07:21 crc kubenswrapper[4774]: I0127 00:07:21.819083 4774 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.150570 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.154301 4774 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.157544 4774 trace.go:236] Trace[594448485]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 00:07:11.346) (total time: 10811ms): Jan 27 00:07:22 crc kubenswrapper[4774]: Trace[594448485]: ---"Objects listed" error: 10811ms (00:07:22.157) Jan 27 00:07:22 crc kubenswrapper[4774]: Trace[594448485]: [10.811328741s] [10.811328741s] END Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.157575 4774 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.161402 4774 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.163679 4774 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.163782 4774 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.163801 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.168398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.168435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc 
kubenswrapper[4774]: I0127 00:07:22.168444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.168462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.168472 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.181603 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.185432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.185465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.185480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.185504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.185520 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.187571 4774 csr.go:261] certificate signing request csr-mj7kx is approved, waiting to be issued Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.196047 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.196754 4774 csr.go:257] certificate signing request csr-mj7kx is issued Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201226 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.201271 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.233032 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.239306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.239356 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.239371 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.239396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.239421 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.255387 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.255500 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.255523 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.266941 4774 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.292708 4774 apiserver.go:52] "Watching apiserver" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.295789 4774 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296016 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296322 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296334 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.296373 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296428 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.296450 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296667 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296687 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.296779 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.296808 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.299224 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.299356 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.299962 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300000 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300086 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300552 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300729 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300758 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.300987 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.305013 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 20:48:28.86236562 +0000 UTC Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.331274 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.343469 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.356374 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.359738 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.376576 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.389659 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.392392 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.395614 4774 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.399683 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.408347 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.414517 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.422663 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.435905 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.445358 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456343 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456399 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456424 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456453 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456479 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456502 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456524 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.456689 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456731 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456758 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456802 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456830 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456877 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456902 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456923 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456946 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.456984 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 00:07:22 
crc kubenswrapper[4774]: I0127 00:07:22.457011 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457033 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457070 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457118 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457136 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457166 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457200 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457171 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457222 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457249 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457270 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457293 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457315 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457339 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457381 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457404 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457428 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457451 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457474 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457494 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457521 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457540 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457559 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457630 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457651 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457669 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457686 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457706 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457727 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457752 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457771 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457792 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457810 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457830 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457875 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457902 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457922 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457942 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457961 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457979 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457997 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458019 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458037 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458058 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458107 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458131 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458150 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458167 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458185 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458202 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458225 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458243 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458261 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458280 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458304 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458323 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458369 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458388 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458409 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458428 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458448 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458467 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458486 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458505 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 
00:07:22.458523 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458540 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458558 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458576 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458593 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458611 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458629 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458647 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458668 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458685 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 
00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458705 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458724 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458742 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458761 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458780 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458814 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458834 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458870 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458896 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458926 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458948 
4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458967 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458982 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458999 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459015 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459033 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459053 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459075 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459095 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459111 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459131 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459151 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459166 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459187 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459220 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459239 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459257 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459219 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459277 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459296 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459314 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459334 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459357 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459381 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459403 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459423 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459441 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459462 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459481 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459514 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459533 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459550 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459567 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459586 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459616 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459632 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459648 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459665 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459683 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459699 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459715 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459732 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459752 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459769 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459788 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459806 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459825 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459842 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457493 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.457689 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458495 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.458817 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.459849 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460270 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460430 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460459 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460501 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460552 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460612 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460688 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460827 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460763 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460912 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460870 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460939 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.460935 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461151 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461125 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461226 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461256 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461343 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461351 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461438 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461492 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461509 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461586 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461593 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461566 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461645 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461706 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461771 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.461997 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462443 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462471 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462508 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462566 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462589 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462809 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462811 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462830 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462849 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462950 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462981 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463115 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463268 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463273 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463214 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463450 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463630 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.463830 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.464168 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.464794 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.465621 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.465949 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.466222 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.466233 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.466509 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.466702 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.467289 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:22.967255891 +0000 UTC m=+21.273032995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.467532 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468031 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468299 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468427 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468679 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468697 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.468997 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469108 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469108 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469363 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469402 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469652 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469915 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469925 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469957 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.469972 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470265 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470354 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470489 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.462886 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470700 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470916 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.470963 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471182 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471129 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471360 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471518 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471547 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471645 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471759 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471781 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471806 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471836 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471885 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.471989 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.472149 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.472604 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.472994 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473418 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473453 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473481 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473501 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473520 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473544 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473368 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.473729 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.474050 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.474777 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.475253 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.475504 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.475903 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.475993 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.477063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.477686 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.477834 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.478348 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.479121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.479111 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.479593 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.480522 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.480756 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.484486 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.484562 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.484848 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.485128 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.485319 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.487064 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.487871 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.488127 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.488539 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489015 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489105 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489318 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489528 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489573 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.489956 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.490269 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.490508 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.490741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.491218 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.491516 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.492027 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494147 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494210 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494418 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494491 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494531 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494554 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494584 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494607 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494635 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494656 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494679 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494702 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494721 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494739 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494757 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494759 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494774 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494777 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494795 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494815 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494834 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494851 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494888 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494932 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494955 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494983 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495002 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.494939 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495029 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495052 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495074 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495097 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495134 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495150 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495174 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495268 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495296 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502709 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502743 4774 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502771 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502789 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502801 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502812 4774 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502836 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502847 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502875 4774 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502886 4774 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502901 4774 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502913 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502924 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502936 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502946 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502955 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502974 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502988 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.502999 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503010 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503022 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503035 4774 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503045 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503056 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503602 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503618 4774 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503714 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503735 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503969 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503987 4774 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503999 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504011 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504027 4774 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504302 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504320 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504372 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504389 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504401 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504437 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504454 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504465 4774 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506270 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506316 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506333 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507566 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507599 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507619 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507633 4774 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507650 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507662 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507675 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507696 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507713 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507728 4774 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507739 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507754 4774 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507764 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507774 4774 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507785 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.495169 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498017 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498291 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507892 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506237 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498278 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508054 4774 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508115 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508150 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508163 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508186 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508202 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498354 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509630 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.498716 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509672 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.509691 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499280 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499370 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499390 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499409 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509946 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.500199 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503129 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510050 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.508215 4774 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503443 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503804 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.503830 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.504893 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.505022 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.505453 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.505889 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.510415 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:23.009749024 +0000 UTC m=+21.315525908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510502 4774 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510601 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510635 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510659 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510846 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.510962 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511094 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511116 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511127 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511148 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511164 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511179 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511192 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511203 4774 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511213 4774 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511224 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506023 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.496616 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.496751 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506066 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.506719 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507381 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.511297 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:23.011222835 +0000 UTC m=+21.316999839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.507707 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.505954 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509411 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509434 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.499002 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.509881 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511426 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511475 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.511518 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512118 4774 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512165 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512209 4774 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512250 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512594 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512757 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513117 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513140 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513313 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513498 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513584 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.513514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514822 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.512266 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514972 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514997 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515013 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515028 4774 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515043 4774 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515057 4774 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515075 4774 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515090 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515102 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515117 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515132 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515146 4774 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515159 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515174 4774 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515190 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515206 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515220 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515233 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515251 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515263 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515277 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515290 4774 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515314 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515328 4774 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515341 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515356 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515369 4774 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515382 4774 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515395 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515408 4774 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515420 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515476 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515491 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515504 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515517 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515531 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515544 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515556 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515569 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515582 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515594 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515607 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515621 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515636 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.515651 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.521636 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522024 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522186 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522425 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522531 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.522584 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523011 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523026 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523039 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523102 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523118 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523195 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.523261 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.524600 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.529751 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.514178 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.530992 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531019 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531034 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531113 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:23.031084658 +0000 UTC m=+21.336861542 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531242 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531266 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531319 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:22 crc kubenswrapper[4774]: E0127 00:07:22.531402 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:23.031386866 +0000 UTC m=+21.337163750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.536337 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.536627 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.539012 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.544574 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.545275 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.546336 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.546478 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.550843 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.551057 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.554193 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.563549 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.565531 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616593 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616654 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616735 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616782 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616772 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616808 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616797 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616922 4774 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616936 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616945 4774 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616956 4774 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616965 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616975 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616984 4774 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.616992 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617004 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617014 4774 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617022 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617030 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617039 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617048 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath 
\"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617056 4774 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617064 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617072 4774 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617081 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617088 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617096 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617105 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617113 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617121 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617130 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617138 4774 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617148 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617157 4774 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" 
DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617166 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617174 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617185 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617193 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617201 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617209 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617222 4774 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617230 4774 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617238 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617246 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617254 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617263 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617271 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617279 4774 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617287 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617296 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617305 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617314 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617322 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617331 4774 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617341 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617350 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617359 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617380 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617389 4774 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617404 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" 
DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617415 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617424 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617433 4774 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617442 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617452 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617462 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617472 4774 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617480 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617489 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617498 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617507 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.617516 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.620332 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.622295 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637319 4774 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637212 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.637639 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.663757 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.685842 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.709336 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.719750 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.739679 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.841822 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.907615 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.915594 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 00:07:22 crc kubenswrapper[4774]: W0127 00:07:22.926798 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ffc4d2005b1fabc86bdde6c308e5a13a330c58baca0ea213e2e6919a116aa173 WatchSource:0}: Error finding container ffc4d2005b1fabc86bdde6c308e5a13a330c58baca0ea213e2e6919a116aa173: Status 404 returned error can't find the container with id ffc4d2005b1fabc86bdde6c308e5a13a330c58baca0ea213e2e6919a116aa173 Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944566 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944641 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:22 crc kubenswrapper[4774]: I0127 00:07:22.944677 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:22Z","lastTransitionTime":"2026-01-27T00:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.020884 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.020981 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.021007 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.021043 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.021011568 +0000 UTC m=+22.326788472 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.021094 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.021169 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.021134761 +0000 UTC m=+22.326911645 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.021978 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.022109 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.022067747 +0000 UTC m=+22.327844641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.046849 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.121834 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.121900 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.121997 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122004 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122013 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122021 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122025 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122031 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122067 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.122055521 +0000 UTC m=+22.427832405 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.122080 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:24.122074932 +0000 UTC m=+22.427851816 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.148821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.198170 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-27 00:02:22 +0000 UTC, rotation deadline is 2026-10-15 18:07:14.396689037 +0000 UTC Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.198260 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6281h59m51.198432172s for next certificate rotation Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.250770 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.306132 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 21:16:24.767943871 +0000 UTC Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.354211 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.456568 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.475643 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8mtkj"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.476010 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.477973 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.478032 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.478915 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.489988 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.502386 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.514611 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.518146 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.518198 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.518210 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"453c4954e146e01f71abf01febc41d8911b76c8cec3fd440a447e8f30a89e500"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.519722 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.520240 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.522095 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" exitCode=255 Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.522151 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.522219 4774 scope.go:117] "RemoveContainer" containerID="5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.523552 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ffc4d2005b1fabc86bdde6c308e5a13a330c58baca0ea213e2e6919a116aa173"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.525318 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb889\" (UniqueName: \"kubernetes.io/projected/052efff3-b53a-4586-8ccf-f8e9b1a47174-kube-api-access-jb889\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.525400 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/052efff3-b53a-4586-8ccf-f8e9b1a47174-hosts-file\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" 
Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.525921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.525987 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ed0aeac8215083856262c3a8929ff514c2bb0a9d94c84b6d6fba293d94ce8c43"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.531381 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.531950 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:07:23 crc kubenswrapper[4774]: E0127 00:07:23.532148 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.537419 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c7538965369
7aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.548461 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.558106 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.564481 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.576574 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.593691 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.606389 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.618712 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.626608 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/052efff3-b53a-4586-8ccf-f8e9b1a47174-hosts-file\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.626668 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb889\" (UniqueName: \"kubernetes.io/projected/052efff3-b53a-4586-8ccf-f8e9b1a47174-kube-api-access-jb889\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.626906 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/052efff3-b53a-4586-8ccf-f8e9b1a47174-hosts-file\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.629178 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.646946 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb889\" (UniqueName: \"kubernetes.io/projected/052efff3-b53a-4586-8ccf-f8e9b1a47174-kube-api-access-jb889\") pod \"node-resolver-8mtkj\" (UID: \"052efff3-b53a-4586-8ccf-f8e9b1a47174\") " pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.651873 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.659771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.659966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.659989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.660008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.660023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.663127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.676059 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.688001 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.701072 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:16Z\\\",\\\"message\\\":\\\"W0127 00:07:05.411600 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 00:07:05.412345 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769472425 cert, and key in /tmp/serving-cert-1556577615/serving-signer.crt, /tmp/serving-cert-1556577615/serving-signer.key\\\\nI0127 00:07:05.647477 1 observer_polling.go:159] Starting file observer\\\\nW0127 00:07:05.650164 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 00:07:05.650261 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:05.650830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1556577615/tls.crt::/tmp/serving-cert-1556577615/tls.key\\\\\\\"\\\\nF0127 00:07:16.006924 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.714404 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.762282 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.788521 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8mtkj" Jan 27 00:07:23 crc kubenswrapper[4774]: W0127 00:07:23.797100 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod052efff3_b53a_4586_8ccf_f8e9b1a47174.slice/crio-832a614ba3a82e5972be975ebb0289acdca88093cb98d66a001f2f8bbd475dc3 WatchSource:0}: Error finding container 832a614ba3a82e5972be975ebb0289acdca88093cb98d66a001f2f8bbd475dc3: Status 404 returned error can't find the container with id 832a614ba3a82e5972be975ebb0289acdca88093cb98d66a001f2f8bbd475dc3 Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.860765 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2nl9s"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.861047 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-57k5g"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.861366 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864339 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mtz9l"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864539 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864611 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864633 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.864544 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5rgv"] Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.865049 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.865743 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.865769 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.866430 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.868572 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.868596 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.868901 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.868927 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.869119 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.869283 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.869385 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.872616 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873687 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.873874 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.874070 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.874230 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.875714 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.876518 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.879100 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.885526 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:16Z\\\",\\\"message\\\":\\\"W0127 00:07:05.411600 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 
00:07:05.412345 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769472425 cert, and key in /tmp/serving-cert-1556577615/serving-signer.crt, /tmp/serving-cert-1556577615/serving-signer.key\\\\nI0127 00:07:05.647477 1 observer_polling.go:159] Starting file observer\\\\nW0127 00:07:05.650164 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 00:07:05.650261 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:05.650830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1556577615/tls.crt::/tmp/serving-cert-1556577615/tls.key\\\\\\\"\\\\nF0127 00:07:16.006924 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 
00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.906326 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.921309 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.928839 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.928986 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd9gc\" (UniqueName: \"kubernetes.io/projected/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-kube-api-access-dd9gc\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929097 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-bin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929183 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929264 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929343 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cni-binary-copy\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929422 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-multus\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929510 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929605 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-etc-kubernetes\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929695 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9bv\" (UniqueName: \"kubernetes.io/projected/c578fd22-1672-431a-9914-66f55a0260bd-kube-api-access-7b9bv\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929791 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.929903 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930026 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930116 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-os-release\") pod \"multus-mtz9l\" (UID: 
\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930277 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-netns\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930451 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930530 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930612 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930690 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-hostroot\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930778 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930879 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmt7t\" (UniqueName: \"kubernetes.io/projected/0abcf78e-9b05-4b89-94f3-4d3230886ce0-kube-api-access-pmt7t\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.930963 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-proxy-tls\") pod \"machine-config-daemon-2nl9s\" (UID: 
\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931043 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-rootfs\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931124 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-cnibin\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931220 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931304 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931403 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931487 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-k8s-cni-cncf-io\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931578 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-multus-certs\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931676 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-system-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931758 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cnibin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.931880 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-daemon-config\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.932008 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-os-release\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.932064 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.933971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934072 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934192 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934298 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934401 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-system-cni-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934484 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934619 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934671 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934711 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-socket-dir-parent\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934739 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-kubelet\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934764 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-conf-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.934792 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.936242 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.951420 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.963485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.975654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.975818 4774 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.975895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.975957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.976010 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:23Z","lastTransitionTime":"2026-01-27T00:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.985788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:23 crc kubenswrapper[4774]: I0127 00:07:23.997832 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.010116 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.018959 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035246 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.035348 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.03532957 +0000 UTC m=+24.341106454 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035436 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035454 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-bin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035311 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035468 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035484 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035500 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd9gc\" (UniqueName: \"kubernetes.io/projected/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-kube-api-access-dd9gc\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035515 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-multus\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035545 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-bin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035576 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035607 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035614 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035646 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035667 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035669 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-cni-multus\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.035721 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cni-binary-copy\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.035905 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.035968 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.035957047 +0000 UTC m=+24.341733981 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036114 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9bv\" (UniqueName: \"kubernetes.io/projected/c578fd22-1672-431a-9914-66f55a0260bd-kube-api-access-7b9bv\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036238 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036346 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036426 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-etc-kubernetes\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036312 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036431 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cni-binary-copy\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036347 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036455 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-etc-kubernetes\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036510 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036652 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-os-release\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036677 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-netns\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036697 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036720 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036739 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036757 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036759 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-hostroot\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036788 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036792 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-hostroot\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036804 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036819 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmt7t\" (UniqueName: \"kubernetes.io/projected/0abcf78e-9b05-4b89-94f3-4d3230886ce0-kube-api-access-pmt7t\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036837 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-rootfs\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036866 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-proxy-tls\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036882 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036897 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036914 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-cnibin\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036931 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-k8s-cni-cncf-io\") pod 
\"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036970 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-multus-certs\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036991 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037007 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037022 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-system-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037036 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cnibin\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037053 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-os-release\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037072 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037089 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037106 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-daemon-config\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " 
pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037147 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037161 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037185 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037199 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-system-cni-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037215 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037246 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037263 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-socket-dir-parent\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.037263 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037277 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-kubelet\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037292 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-conf-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.037313 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.037299724 +0000 UTC m=+24.343076618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037320 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-conf-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037394 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037407 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.036740 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-os-release\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037443 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-netns\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037452 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: 
I0127 00:07:24.037483 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037483 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037510 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-cnibin\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037522 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-k8s-cni-cncf-io\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037587 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-run-multus-certs\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037633 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037684 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-system-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037702 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-rootfs\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037727 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-cnibin\") pod 
\"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037782 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-os-release\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037800 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037814 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037835 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037832 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037878 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-system-cni-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037915 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-socket-dir-parent\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038014 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-host-var-lib-kubelet\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " 
pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.037212 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c578fd22-1672-431a-9914-66f55a0260bd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038080 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038100 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-cni-dir\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0abcf78e-9b05-4b89-94f3-4d3230886ce0-multus-daemon-config\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038597 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.038920 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c578fd22-1672-431a-9914-66f55a0260bd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.039352 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.051253 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-proxy-tls\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.055249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9bv\" (UniqueName: \"kubernetes.io/projected/c578fd22-1672-431a-9914-66f55a0260bd-kube-api-access-7b9bv\") pod \"multus-additional-cni-plugins-57k5g\" (UID: \"c578fd22-1672-431a-9914-66f55a0260bd\") " 
pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.063297 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd9gc\" (UniqueName: \"kubernetes.io/projected/3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a-kube-api-access-dd9gc\") pod \"machine-config-daemon-2nl9s\" (UID: \"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\") " pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.064486 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmt7t\" (UniqueName: \"kubernetes.io/projected/0abcf78e-9b05-4b89-94f3-4d3230886ce0-kube-api-access-pmt7t\") pod \"multus-mtz9l\" (UID: \"0abcf78e-9b05-4b89-94f3-4d3230886ce0\") " pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.069476 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078001 
4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") pod \"ovnkube-node-l5rgv\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.078966 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.103992 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.131545 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.138441 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.138475 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138583 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138594 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138602 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138610 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138623 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138626 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138662 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.138650956 +0000 UTC m=+24.444427840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.138673 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:26.138668726 +0000 UTC m=+24.444445610 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.154278 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.168490 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181312 4774 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.181337 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.187735 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.192269 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.200011 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mtz9l" Jan 27 00:07:24 crc kubenswrapper[4774]: W0127 00:07:24.201058 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3605b94c_c171_4ff3_a3c9_d8e6a7cf7a9a.slice/crio-c5a0e3ad5be49be95edfac85a321160c83892fc927992884046993fc624c137c WatchSource:0}: Error finding container c5a0e3ad5be49be95edfac85a321160c83892fc927992884046993fc624c137c: Status 404 returned error can't find the container with id c5a0e3ad5be49be95edfac85a321160c83892fc927992884046993fc624c137c Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.201134 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.208460 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-57k5g" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.216917 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.221215 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: W0127 00:07:24.230742 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc578fd22_1672_431a_9914_66f55a0260bd.slice/crio-a4fb03f368f16229bd0367a444590833fecf3a54be2749f39faf5d93e1bbb4e5 WatchSource:0}: Error finding container a4fb03f368f16229bd0367a444590833fecf3a54be2749f39faf5d93e1bbb4e5: Status 404 returned error can't find the container with id a4fb03f368f16229bd0367a444590833fecf3a54be2749f39faf5d93e1bbb4e5 Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.233266 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: W0127 00:07:24.244660 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb881c9d_a960_48ae_93bf_d0ccd687e0b9.slice/crio-0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83 WatchSource:0}: Error finding container 0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83: Status 404 returned error can't find the container with id 0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83 Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.250073 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.276596 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:16Z\\\",\\\"message\\\":\\\"W0127 00:07:05.411600 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 
00:07:05.412345 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769472425 cert, and key in /tmp/serving-cert-1556577615/serving-signer.crt, /tmp/serving-cert-1556577615/serving-signer.key\\\\nI0127 00:07:05.647477 1 observer_polling.go:159] Starting file observer\\\\nW0127 00:07:05.650164 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 00:07:05.650261 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:05.650830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1556577615/tls.crt::/tmp/serving-cert-1556577615/tls.key\\\\\\\"\\\\nF0127 00:07:16.006924 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 
00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.286545 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.293642 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.307141 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 07:22:17.506055782 +0000 UTC Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.356474 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.356821 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.356541 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.356917 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.356499 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.356971 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.365299 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.365824 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.367241 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.368060 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.369188 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.369813 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.370504 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.371500 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.372227 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.373302 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.373840 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.375396 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.376139 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.378278 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.379353 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.380548 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.381673 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.386209 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.387090 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.388372 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.389027 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.389629 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.391140 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.395111 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.396738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.396779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.396787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.396804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.397030 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.398344 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.399974 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.401698 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.402977 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.404280 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.405013 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.410175 4774 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.410404 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.413955 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.415385 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.416096 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.418166 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.419491 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 
00:07:24.421011 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.423890 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.424834 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.426114 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.427006 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.428336 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.429612 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.430279 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.431419 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.432157 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.436456 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.437010 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.437984 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.438512 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.439208 4774 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.440472 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.441074 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499454 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.499508 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.531106 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" exitCode=0 Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.531198 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.531269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.532576 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerStarted","Data":"ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.532624 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerStarted","Data":"a4fb03f368f16229bd0367a444590833fecf3a54be2749f39faf5d93e1bbb4e5"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.535086 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.538016 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:07:24 crc kubenswrapper[4774]: E0127 00:07:24.538159 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.539692 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.539743 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"5b6b8d7c26b8d55b60d556c03d62d48dff6444cff0ca626540bec4b5124c5cc3"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.541371 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.541843 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.541939 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"c5a0e3ad5be49be95edfac85a321160c83892fc927992884046993fc624c137c"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.553109 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8mtkj" event={"ID":"052efff3-b53a-4586-8ccf-f8e9b1a47174","Type":"ContainerStarted","Data":"07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.553159 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8mtkj" event={"ID":"052efff3-b53a-4586-8ccf-f8e9b1a47174","Type":"ContainerStarted","Data":"832a614ba3a82e5972be975ebb0289acdca88093cb98d66a001f2f8bbd475dc3"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.571039 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ebbc17f79533614a14a0f1e296512fcd303858d2c2f0b4b4e925c54238af8c7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:16Z\\\",\\\"message\\\":\\\"W0127 00:07:05.411600 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 
00:07:05.412345 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769472425 cert, and key in /tmp/serving-cert-1556577615/serving-signer.crt, /tmp/serving-cert-1556577615/serving-signer.key\\\\nI0127 00:07:05.647477 1 observer_polling.go:159] Starting file observer\\\\nW0127 00:07:05.650164 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 00:07:05.650261 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:05.650830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1556577615/tls.crt::/tmp/serving-cert-1556577615/tls.key\\\\\\\"\\\\nF0127 00:07:16.006924 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 
00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.586798 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601641 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.601902 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.607142 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.620066 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.633163 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.654105 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.669789 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.686114 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.703904 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.704958 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.719894 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: 
I0127 00:07:24.733244 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.744504 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.766131 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.783301 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.799319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808455 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.808489 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.818011 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.829507 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.853061 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.879459 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.899256 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.911274 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:24Z","lastTransitionTime":"2026-01-27T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.919230 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.956303 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.985662 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:24 crc kubenswrapper[4774]: I0127 00:07:24.998547 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.013597 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.020821 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.035232 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116269 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.116352 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.219817 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.308329 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 23:09:18.563032012 +0000 UTC Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.323802 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.426311 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.529417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.529793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.530119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.530230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.530294 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.557410 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562284 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562337 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562354 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562367 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.562381 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.563429 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a" exitCode=0 Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.564006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.576017 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha25
6:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.594994 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.615973 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.632487 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.632940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.632974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.632987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.633003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.633014 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.649145 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.666759 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.680575 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.702093 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.723216 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.737443 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.739153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.755502 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.766284 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.788018 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.802588 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.814055 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.827018 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.840594 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.841909 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.853843 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.871336 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator
@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.883681 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.905263 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.920333 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.933682 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.944481 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:25Z","lastTransitionTime":"2026-01-27T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.949150 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.965731 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-d
ir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:25 crc kubenswrapper[4774]: I0127 00:07:25.986224 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:25Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047159 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.047311 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.059775 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060014 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.059981224 +0000 UTC m=+28.365758118 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.060073 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.060165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060314 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060390 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.060373115 +0000 UTC m=+28.366149999 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060323 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.060567 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.06055349 +0000 UTC m=+28.366330604 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.109120 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-g4cnl"] Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.109504 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.112551 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.112574 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.113198 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.114345 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.134031 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.149908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.149955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.149968 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.149993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.150012 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.150444 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161149 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8f7b\" (UniqueName: \"kubernetes.io/projected/2c687c86-483c-433c-8a0c-a232c5d48974-kube-api-access-v8f7b\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161236 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c687c86-483c-433c-8a0c-a232c5d48974-host\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161268 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c687c86-483c-433c-8a0c-a232c5d48974-serviceca\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161301 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.161330 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161475 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161501 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161512 4774 projected.go:194] Error preparing data for 
projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161546 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161577 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.161558672 +0000 UTC m=+28.467335556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161583 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161613 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.161700 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:30.161675126 +0000 UTC m=+28.467452050 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.170562 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.195800 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete 
status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\
\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.217607 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.231615 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.249902 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.253277 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.262916 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8f7b\" (UniqueName: \"kubernetes.io/projected/2c687c86-483c-433c-8a0c-a232c5d48974-kube-api-access-v8f7b\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.263060 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c687c86-483c-433c-8a0c-a232c5d48974-host\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.263116 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c687c86-483c-433c-8a0c-a232c5d48974-serviceca\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.263248 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c687c86-483c-433c-8a0c-a232c5d48974-host\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.264303 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c687c86-483c-433c-8a0c-a232c5d48974-serviceca\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.271426 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.287097 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.288821 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8f7b\" (UniqueName: \"kubernetes.io/projected/2c687c86-483c-433c-8a0c-a232c5d48974-kube-api-access-v8f7b\") pod \"node-ca-g4cnl\" (UID: \"2c687c86-483c-433c-8a0c-a232c5d48974\") " pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.301236 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2k
z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.309444 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 12:43:25.667104402 +0000 UTC Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.314888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.346436 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.356203 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.356338 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357084 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357323 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.357381 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.357554 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: E0127 00:07:26.357654 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.382120 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.422667 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-g4cnl" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.432672 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: W0127 00:07:26.440367 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c687c86_483c_433c_8a0c_a232c5d48974.slice/crio-6e4deb6504b764f051c7702fe3788d5862be1fbae721213e490eb7b38e3653bf WatchSource:0}: Error finding container 6e4deb6504b764f051c7702fe3788d5862be1fbae721213e490eb7b38e3653bf: Status 404 returned error can't find the container with id 6e4deb6504b764f051c7702fe3788d5862be1fbae721213e490eb7b38e3653bf Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.460167 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.562540 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.568891 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerStarted","Data":"fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.571366 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4cnl" event={"ID":"2c687c86-483c-433c-8a0c-a232c5d48974","Type":"ContainerStarted","Data":"6e4deb6504b764f051c7702fe3788d5862be1fbae721213e490eb7b38e3653bf"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.576031 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.600374 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.649975 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.666896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.666959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.666971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.666989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.667001 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.679086 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.692669 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.704277 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.714852 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.727584 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.743743 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.768943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.768971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.768981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.769011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.769023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.783460 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.824886 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.863831 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871449 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871468 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.871478 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.906116 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.960141 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0
ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.973792 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:26Z","lastTransitionTime":"2026-01-27T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:26 crc kubenswrapper[4774]: I0127 00:07:26.984775 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.081345 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.183969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.184008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.184017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.184030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.184039 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.287286 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.310479 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 11:56:33.24036792 +0000 UTC Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.389729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.493619 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.584397 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g4cnl" event={"ID":"2c687c86-483c-433c-8a0c-a232c5d48974","Type":"ContainerStarted","Data":"8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.588802 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52" exitCode=0 Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.588891 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.596399 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.613515 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.640914 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.663004 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.684427 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.699918 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.718888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e
8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.736600 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.748594 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.760961 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.775758 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.795292 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.804569 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.814624 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.828488 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.855916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.872564 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.893623 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.907242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.907312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.907331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 
00:07:27.907363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.907386 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:27Z","lastTransitionTime":"2026-01-27T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.912944 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.928936 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.953888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.967965 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:27 crc kubenswrapper[4774]: I0127 00:07:27.989904 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.009519 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.015623 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.035479 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.055392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.093615 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.112970 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.119559 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.127494 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.142560 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.162672 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.225667 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.312043 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 00:58:52.6490574 +0000 UTC Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.330163 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.356646 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.356758 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:28 crc kubenswrapper[4774]: E0127 00:07:28.356851 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.356910 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:28 crc kubenswrapper[4774]: E0127 00:07:28.357102 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:28 crc kubenswrapper[4774]: E0127 00:07:28.357270 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.433820 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.538572 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.599576 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.604366 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74" exitCode=0 Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.604651 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.644345 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.646900 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.647103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.647234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.647364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.647515 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.665173 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.680520 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.712748 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.733268 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.751571 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.755059 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.772408 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.793557 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.810056 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.810944 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.814954 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.822906 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.832161 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.858697 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.859621 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.878499 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.895815 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.912015 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.928938 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.942630 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.959573 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.963360 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:28Z","lastTransitionTime":"2026-01-27T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:28 crc kubenswrapper[4774]: I0127 00:07:28.985009 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e
8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.004139 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.022044 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.039934 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.055660 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067160 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.067314 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.074374 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.092075 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.124808 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.170217 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.173121 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.211172 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.256674 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.273896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.273973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.273993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.274026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.274048 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.288477 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.312824 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 00:49:39.694073368 +0000 UTC Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377403 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377431 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.377443 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.480830 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.585676 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.613985 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c" exitCode=0 Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.614093 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c"} Jan 27 00:07:29 crc kubenswrapper[4774]: E0127 00:07:29.625994 4774 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.650367 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.672284 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688357 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.688512 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.690756 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.727654 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cr
i-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.748261 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.762717 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.784110 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z 
is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.791712 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.800379 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.816798 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.829539 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.841567 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.861262 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.878261 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.891622 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.894576 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.907883 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:29 crc kubenswrapper[4774]: I0127 00:07:29.998336 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:29Z","lastTransitionTime":"2026-01-27T00:07:29Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.100709 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.111101 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.111179 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.111230 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.111330 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.111371 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.111358124 +0000 UTC m=+36.417135008 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.112022 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.11198607 +0000 UTC m=+36.417762954 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.112128 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.112175 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.112167885 +0000 UTC m=+36.417944769 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.174471 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.175959 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.176267 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.204409 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.212571 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.212666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212891 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212906 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212950 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212970 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.212923 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.213016 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.213057 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.213030194 +0000 UTC m=+36.518807108 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.213083 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:38.213071065 +0000 UTC m=+36.518847989 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.307638 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.313084 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 03:19:17.400911378 +0000 UTC Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.356500 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.356560 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.356626 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.356705 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.356796 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:30 crc kubenswrapper[4774]: E0127 00:07:30.357059 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.410423 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.514724 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.619192 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.625380 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.625913 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.625987 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.631116 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4" exitCode=0 Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.631736 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.646726 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.665566 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.670203 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.682967 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.713731 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.721729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.721782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.721800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.721989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.722021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.731565 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.754420 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.785411 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360
f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.803767 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.819378 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.832296 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.850415 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.869187 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.882151 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.894035 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.909850 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.927321 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.937227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:30Z","lastTransitionTime":"2026-01-27T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.948634 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.963345 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.981248 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:30 crc kubenswrapper[4774]: I0127 00:07:30.997322 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.014528 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.035327 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.040986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.041030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.041041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.041057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.041072 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.054892 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.072611 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.103828 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.120254 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.143728 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.146849 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.161739 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.175392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.191178 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.224729 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246735 4774 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.246746 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.313700 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:28:15.383430351 +0000 UTC Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.349719 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.452763 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.556901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.556992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.557013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.557091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.557154 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.641195 4774 generic.go:334] "Generic (PLEG): container finished" podID="c578fd22-1672-431a-9914-66f55a0260bd" containerID="4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a" exitCode=0 Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.641254 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerDied","Data":"4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.642277 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661441 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.661511 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.685737 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.690346 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.712646 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.736606 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
1-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.757089 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.775945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.775994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc 
kubenswrapper[4774]: I0127 00:07:31.776005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.776023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.776036 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.776593 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.796303 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.813825 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.829703 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.845033 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.873468 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884585 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.884626 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.909546 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.939617 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.957369 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.980678 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:31 crc kubenswrapper[4774]: I0127 00:07:31.987712 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:31Z","lastTransitionTime":"2026-01-27T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.002653 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.014407 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.035159 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.050377 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.066902 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.084942 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090283 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.090325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.106819 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.126427 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.138731 4774 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.139308 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578b
c18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc/status\": read tcp 38.129.56.27:55304->38.129.56.27:6443: use of closed network connection" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.186196 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194323 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.194436 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.209877 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.222731 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.237153 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.269525 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296882 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.296914 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.303404 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.314674 4774 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:56:38.686782435 +0000 UTC Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.355951 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.356000 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.355951 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.356121 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.356261 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.356367 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.383430 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801
e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.396703 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.399652 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.424955 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.469630 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://809
13307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.502343 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.512160 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.545985 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.584111 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.604771 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.629164 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.631182 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.651040 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" event={"ID":"c578fd22-1672-431a-9914-66f55a0260bd","Type":"ContainerStarted","Data":"acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341"} Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.650827 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.656511 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.667574 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e
95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.673656 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.682765 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.703652 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.711933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.711989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.712007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.712030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.712046 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.712741 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\
\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.725433 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.732595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.732749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.733002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.733189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.733339 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.746543 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-c
ontroller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.750013 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406ee
c4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\
\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: E0127 00:07:32.750278 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.752519 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.791507 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.827916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855703 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.855714 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.864655 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.904391 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.945580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.958227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:32Z","lastTransitionTime":"2026-01-27T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:32 crc kubenswrapper[4774]: I0127 00:07:32.985171 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.026916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.061981 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.062029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.062039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.062062 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.062074 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.071607 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.109153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.150718 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.165132 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.188813 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.230375 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267501 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.267537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.282989 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.305401 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.314929 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:02:13.437670063 +0000 UTC Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.350142 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369870 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.369880 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.389219 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.431545 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.473038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.473101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc 
kubenswrapper[4774]: I0127 00:07:33.473119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.473148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.473171 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.479230 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360
f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.505655 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.577729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.658677 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/0.log" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.663821 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069" exitCode=1 Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.663916 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.665005 4774 scope.go:117] "RemoveContainer" containerID="b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.681839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.681941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.681964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.682066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.682103 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.703656 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.723132 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.748958 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.777160 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.786067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.786532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc 
kubenswrapper[4774]: I0127 00:07:33.786803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.787045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.787249 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.806833 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360
f6f7137fe264d6e2e05e8069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"00:07:32.927299 6027 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 00:07:32.927264 6027 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 00:07:32.927354 6027 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 00:07:32.927375 6027 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 00:07:32.927391 6027 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 00:07:32.927840 6027 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 00:07:32.927928 6027 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 00:07:32.927968 6027 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 00:07:32.927990 6027 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 00:07:32.927998 6027 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:07:32.928023 6027 factory.go:656] Stopping watch factory\\\\nI0127 00:07:32.928047 6027 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:07:32.928053 6027 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 00:07:32.928076 6027 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0127 00:07:32.928091 6027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0127 00:07:32.928099 6027 handler.go:208] Removed 
*v1.NetworkPolicy\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f
36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.827339 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.847879 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.873527 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890443 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890891 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.890990 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.903468 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.946837 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994407 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.994448 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:33Z","lastTransitionTime":"2026-01-27T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:33 crc kubenswrapper[4774]: I0127 00:07:33.996510 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:33Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.030598 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.075997 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098836 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.098955 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.106306 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202136 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.202281 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.305677 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.316147 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 14:28:18.936091879 +0000 UTC Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.360483 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:34 crc kubenswrapper[4774]: E0127 00:07:34.360674 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.361331 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:34 crc kubenswrapper[4774]: E0127 00:07:34.361446 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.361532 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:34 crc kubenswrapper[4774]: E0127 00:07:34.361613 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.409363 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.512535 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.614970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.615011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.615023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.615039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.615049 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.669088 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/0.log" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.671255 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.683256 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.698326 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c35
9beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.713389 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.716982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.717026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.717037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.717053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.717063 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.728489 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.743281 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.758977 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.775508 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.793928 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.806384 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.819926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.819987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.819998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.820014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.820026 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.824216 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.836301 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.848264 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.860830 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.873012 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.885359 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.901032 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"00:07:32.927299 6027 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 00:07:32.927264 6027 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 00:07:32.927354 6027 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 00:07:32.927375 6027 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 00:07:32.927391 6027 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 00:07:32.927840 6027 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 00:07:32.927928 6027 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 00:07:32.927968 6027 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 00:07:32.927990 6027 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 00:07:32.927998 6027 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:07:32.928023 6027 factory.go:656] Stopping watch factory\\\\nI0127 00:07:32.928047 6027 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:07:32.928053 6027 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 00:07:32.928076 6027 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0127 00:07:32.928091 6027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0127 00:07:32.928099 6027 handler.go:208] Removed 
*v1.NetworkPolicy\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuse
s\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:34 crc kubenswrapper[4774]: I0127 00:07:34.922795 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:34Z","lastTransitionTime":"2026-01-27T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.025930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.025981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.025993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.026007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.026017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.128982 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.233172 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.316385 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:43:12.350698798 +0000 UTC Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.336362 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.438666 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.542309 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.644266 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.747939 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.747991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.748001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.748018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.748029 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.851694 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.954959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.955008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.955023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.955044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:35 crc kubenswrapper[4774]: I0127 00:07:35.955058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:35Z","lastTransitionTime":"2026-01-27T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.058672 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.162822 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.265685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.316614 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 23:28:31.593737229 +0000 UTC Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.356558 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.356712 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:36 crc kubenswrapper[4774]: E0127 00:07:36.356849 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.356717 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:36 crc kubenswrapper[4774]: E0127 00:07:36.357085 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:36 crc kubenswrapper[4774]: E0127 00:07:36.357306 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.369713 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.473974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.474064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.474088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.474123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.474149 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.577919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.578009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.578036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.578072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.578101 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.681839 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.684521 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/1.log" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.685517 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/0.log" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.689719 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" exitCode=1 Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.689789 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.689894 4774 scope.go:117] "RemoveContainer" containerID="b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.690995 4774 scope.go:117] "RemoveContainer" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" Jan 27 00:07:36 crc kubenswrapper[4774]: E0127 00:07:36.691385 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.722768 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.740917 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.762270 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.780214 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.785439 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.802029 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.824043 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc9019
21eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.829844 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6"] Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.830383 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.834236 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.834668 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.847453 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.865732 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.888444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.888500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.888513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc 
kubenswrapper[4774]: I0127 00:07:36.888537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.888557 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.899016 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be
5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911557 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911678 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnp67\" (UniqueName: \"kubernetes.io/projected/936f5a31-13bf-456e-83d8-1aee977d2af5-kube-api-access-qnp67\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/936f5a31-13bf-456e-83d8-1aee977d2af5-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911850 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.911799 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.927135 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.944981 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.963602 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.986360 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:36Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.991175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.991239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:36 crc 
kubenswrapper[4774]: I0127 00:07:36.991259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.991291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:36 crc kubenswrapper[4774]: I0127 00:07:36.991315 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:36Z","lastTransitionTime":"2026-01-27T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013091 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnp67\" (UniqueName: \"kubernetes.io/projected/936f5a31-13bf-456e-83d8-1aee977d2af5-kube-api-access-qnp67\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013143 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/936f5a31-13bf-456e-83d8-1aee977d2af5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013183 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013234 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.013749 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.014980 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/936f5a31-13bf-456e-83d8-1aee977d2af5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.019804 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"00:07:32.927299 6027 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 00:07:32.927264 6027 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 00:07:32.927354 6027 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 00:07:32.927375 6027 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 00:07:32.927391 6027 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 00:07:32.927840 6027 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 00:07:32.927928 6027 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 00:07:32.927968 6027 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 00:07:32.927990 6027 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 00:07:32.927998 6027 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:07:32.928023 6027 factory.go:656] Stopping watch factory\\\\nI0127 00:07:32.928047 6027 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:07:32.928053 6027 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 00:07:32.928076 6027 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0127 00:07:32.928091 6027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0127 00:07:32.928099 6027 handler.go:208] Removed *v1.NetworkPolicy\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.028767 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/936f5a31-13bf-456e-83d8-1aee977d2af5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.044569 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0
ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.048972 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnp67\" (UniqueName: \"kubernetes.io/projected/936f5a31-13bf-456e-83d8-1aee977d2af5-kube-api-access-qnp67\") pod \"ovnkube-control-plane-749d76644c-wbzx6\" (UID: \"936f5a31-13bf-456e-83d8-1aee977d2af5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.061133 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.078309 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095403 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.095453 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.096567 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.111070 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.150200 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.151174 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0544725efe182e4a0035123593af6f38ef88360f6f7137fe264d6e2e05e8069\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"message\\\":\\\"00:07:32.927299 6027 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 00:07:32.927264 6027 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 00:07:32.927354 6027 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 00:07:32.927375 6027 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 00:07:32.927391 6027 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 00:07:32.927840 6027 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 00:07:32.927928 6027 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 00:07:32.927968 6027 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 00:07:32.927990 6027 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 00:07:32.927998 6027 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:07:32.928023 6027 factory.go:656] Stopping watch factory\\\\nI0127 00:07:32.928047 6027 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:07:32.928053 6027 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 00:07:32.928076 6027 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0127 00:07:32.928091 6027 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0127 00:07:32.928099 6027 handler.go:208] Removed 
*v1.NetworkPolicy\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.182532 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.198613 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.204621 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.220011 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.234792 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.251117 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.269953 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.283257 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.297296 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.301636 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.314492 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.317305 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 22:06:14.174755994 +0000 UTC Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.336435 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405665 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.405740 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.508728 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.612985 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.698238 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/1.log" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.701770 4774 scope.go:117] "RemoveContainer" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" Jan 27 00:07:37 crc kubenswrapper[4774]: E0127 00:07:37.702081 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.704138 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" event={"ID":"936f5a31-13bf-456e-83d8-1aee977d2af5","Type":"ContainerStarted","Data":"feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.704204 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" event={"ID":"936f5a31-13bf-456e-83d8-1aee977d2af5","Type":"ContainerStarted","Data":"5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.704223 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" event={"ID":"936f5a31-13bf-456e-83d8-1aee977d2af5","Type":"ContainerStarted","Data":"986726594a406074d4dcdec60185f20bca763efacdbc4316b0a539ddc05c1c0e"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.715957 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.724536 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.744267 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.757512 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.773727 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.789390 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.805801 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.817978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.818226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.818241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.818263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.818279 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.823535 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.849272 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.866483 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.881211 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.893294 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.905619 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.921476 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:37Z","lastTransitionTime":"2026-01-27T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.922808 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.937153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.963108 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:37 crc kubenswrapper[4774]: I0127 00:07:37.986409 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:37Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.010713 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.025126 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.044932 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.067005 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.083979 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.104083 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.122443 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 
27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.127371 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.128116 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128318 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.128288733 +0000 UTC m=+52.434065757 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.128372 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.128489 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128529 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128599 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.128581331 +0000 UTC m=+52.434358215 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128604 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.128648 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.128639282 +0000 UTC m=+52.434416166 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.141656 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.158824 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.176821 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.206166 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866b
e30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.224788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.229447 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.229532 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229721 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered 
Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229762 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229775 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229811 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229833 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229783 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.229942 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.229918502 +0000 UTC m=+52.535695426 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.230194 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.230096367 +0000 UTC m=+52.535873271 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.230424 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.245217 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.262694 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.282519 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.306793 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.317481 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 12:24:22.063748476 +0000 UTC Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333654 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.333807 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.336416 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b2
6e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.355955 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.355966 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.356068 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.357006 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.357322 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.357420 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.393272 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6djzf"] Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.393935 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.394024 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.412446 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.433316 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.436892 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.436954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.436973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.436998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.437013 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.455771 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.478937 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.499744 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.515152 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.528989 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.533324 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.533452 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtfjl\" (UniqueName: \"kubernetes.io/projected/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-kube-api-access-xtfjl\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.540921 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.540966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.540981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: 
I0127 00:07:38.541012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.541025 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.549239 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.562667 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.575638 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.587764 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.603310 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.619148 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.634791 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.634850 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtfjl\" (UniqueName: \"kubernetes.io/projected/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-kube-api-access-xtfjl\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.635090 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: E0127 00:07:38.635248 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:39.135208128 +0000 UTC m=+37.440985052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.637298 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.645904 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.650458 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.658540 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtfjl\" (UniqueName: \"kubernetes.io/projected/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-kube-api-access-xtfjl\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.668767 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.695638 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:38Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.748538 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.851985 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:38 crc kubenswrapper[4774]: I0127 00:07:38.955688 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:38Z","lastTransitionTime":"2026-01-27T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.059629 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.140698 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:39 crc kubenswrapper[4774]: E0127 00:07:39.141099 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:39 crc kubenswrapper[4774]: E0127 00:07:39.141238 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:40.141199877 +0000 UTC m=+38.446976801 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.163486 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.266686 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.317940 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 13:59:45.028051748 +0000 UTC Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369589 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.369609 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.472922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.472995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.473017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.473049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.473074 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576457 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.576528 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.679685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.783666 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.888245 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:39 crc kubenswrapper[4774]: I0127 00:07:39.992350 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:39Z","lastTransitionTime":"2026-01-27T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.096755 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.155098 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.155341 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.155414 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:42.155396445 +0000 UTC m=+40.461173339 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.200164 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.304524 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.319062 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:28:19.46561291 +0000 UTC Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.356513 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.356693 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.356997 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.356975 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.357066 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.357176 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.357465 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:40 crc kubenswrapper[4774]: E0127 00:07:40.357646 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.408446 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.513684 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.617532 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.723782 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.827534 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:40 crc kubenswrapper[4774]: I0127 00:07:40.931638 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:40Z","lastTransitionTime":"2026-01-27T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.035166 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.138825 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.241972 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.319843 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 19:15:26.44868528 +0000 UTC Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.346293 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.449893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.449976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.449996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.450027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.450045 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.553908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.553995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.554024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.554057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.554079 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.658415 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.762331 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.865810 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:41 crc kubenswrapper[4774]: I0127 00:07:41.969461 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:41Z","lastTransitionTime":"2026-01-27T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073079 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.073210 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.176917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.176990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.177013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.177064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.177221 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.179127 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.179449 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.179596 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:46.179554729 +0000 UTC m=+44.485331843 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.280986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.281119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.281143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.281173 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.281195 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.321108 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 13:45:53.721798109 +0000 UTC Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.356077 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.356234 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.356370 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.356437 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.356703 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.356924 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.357150 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.357554 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.378276 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.386527 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.395164 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.415383 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.446626 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.482033 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.495150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.504651 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.529245 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.547263 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.565597 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.583773 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":
\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.598942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.599027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.599087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.599119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.599182 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.601588 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.618715 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.635227 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.657092 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.672261 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.686667 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.703169 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.720198 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805821 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.805842 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.818741 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.840725 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.846692 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.873374 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.880841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.880962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.881017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.881046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.881099 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.905813 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.912430 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.934087 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.941293 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.963940 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:42Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:42 crc kubenswrapper[4774]: E0127 00:07:42.964171 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.966909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.966986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.967007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.967046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:42 crc kubenswrapper[4774]: I0127 00:07:42.967073 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:42Z","lastTransitionTime":"2026-01-27T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.100420 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.204772 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.308853 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.321830 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 04:15:43.27442989 +0000 UTC Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.412647 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.515466 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.618446 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.722393 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.826564 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.929993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.930069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.930089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.930118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:43 crc kubenswrapper[4774]: I0127 00:07:43.930136 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:43Z","lastTransitionTime":"2026-01-27T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.033464 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137380 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.137429 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.241154 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.323042 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 00:24:04.814327586 +0000 UTC Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.345143 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.357162 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.357307 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:44 crc kubenswrapper[4774]: E0127 00:07:44.357446 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.357476 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.357496 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:44 crc kubenswrapper[4774]: E0127 00:07:44.357693 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:44 crc kubenswrapper[4774]: E0127 00:07:44.357846 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:44 crc kubenswrapper[4774]: E0127 00:07:44.357990 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.449358 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.552281 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.656433 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.759852 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.863210 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:44 crc kubenswrapper[4774]: I0127 00:07:44.966904 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:44Z","lastTransitionTime":"2026-01-27T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.070736 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.174694 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.277976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.278038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.278053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.278075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.278087 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.323678 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 18:17:05.992677539 +0000 UTC Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.357057 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.380256 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.483325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.591481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.592287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.592303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.592514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.592526 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.697890 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.750982 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.753771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.754554 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.775208 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.795580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801167 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.801199 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.811930 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.829207 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.859915 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d780
1e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.877794 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.893753 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.904797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:45Z","lastTransitionTime":"2026-01-27T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.927900 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.951485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.965979 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:45 crc kubenswrapper[4774]: I0127 00:07:45.986227 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:45Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007700 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.007794 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.011786 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679c
cb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.030169 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"mul
tus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.050333 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.070287 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.085517 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.104111 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.110994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.111036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.111058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.111092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.111117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.213935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.213982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.213998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.214023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.214041 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.233577 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.233849 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.233991 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:07:54.233957781 +0000 UTC m=+52.539734695 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.318205 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.324230 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 14:37:47.338057843 +0000 UTC Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.356197 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.356216 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.356312 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.356329 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.356523 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.356742 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.356816 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:46 crc kubenswrapper[4774]: E0127 00:07:46.356924 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421153 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.421310 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.525483 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.628140 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.731596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.731976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.732010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.732044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.732067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.835267 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:46 crc kubenswrapper[4774]: I0127 00:07:46.939502 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:46Z","lastTransitionTime":"2026-01-27T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.042839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.042951 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.042973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.043007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.043029 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.146661 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.249792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.249924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.249954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.249988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.250017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.324678 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 16:20:38.767977594 +0000 UTC Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.354450 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.458623 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562278 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.562300 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.665946 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.769457 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.872694 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976703 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:47 crc kubenswrapper[4774]: I0127 00:07:47.976728 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:47Z","lastTransitionTime":"2026-01-27T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.080576 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.183808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.183947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.183972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.184009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.184037 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.287349 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.325627 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 22:01:25.560990453 +0000 UTC Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.356462 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.356611 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:48 crc kubenswrapper[4774]: E0127 00:07:48.356664 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.356758 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:48 crc kubenswrapper[4774]: E0127 00:07:48.356849 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.356971 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:48 crc kubenswrapper[4774]: E0127 00:07:48.357095 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:48 crc kubenswrapper[4774]: E0127 00:07:48.357238 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.390946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.391022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.391049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.391087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.391114 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494173 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.494217 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.597565 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702246 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.702376 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.805718 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.908989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.909065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.909090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.909124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:48 crc kubenswrapper[4774]: I0127 00:07:48.909153 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:48Z","lastTransitionTime":"2026-01-27T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.012654 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.115515 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.218452 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.321727 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.326351 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 18:20:45.149978856 +0000 UTC Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.425512 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532703 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.532750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.636997 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.740969 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.844628 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:49 crc kubenswrapper[4774]: I0127 00:07:49.947505 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:49Z","lastTransitionTime":"2026-01-27T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.051444 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.155791 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259646 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.259670 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.326942 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 07:10:15.389749797 +0000 UTC Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.356543 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.356630 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.356666 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.356564 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:50 crc kubenswrapper[4774]: E0127 00:07:50.356800 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:50 crc kubenswrapper[4774]: E0127 00:07:50.357029 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:50 crc kubenswrapper[4774]: E0127 00:07:50.357189 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:50 crc kubenswrapper[4774]: E0127 00:07:50.357373 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.362642 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.466657 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.570301 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.674196 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.778196 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.881911 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:50 crc kubenswrapper[4774]: I0127 00:07:50.985912 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:50Z","lastTransitionTime":"2026-01-27T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.089546 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.193544 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.297548 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.327387 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 19:52:23.569776092 +0000 UTC Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.401653 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.504620 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.608459 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.711129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.711466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.711594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.711821 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.712047 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.814988 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:51 crc kubenswrapper[4774]: I0127 00:07:51.918571 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:51Z","lastTransitionTime":"2026-01-27T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.021975 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.125959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.126061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.126081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.126110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.126131 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.230668 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.328148 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 01:12:08.43469906 +0000 UTC Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.334481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.356116 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.356169 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.356223 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.356372 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:52 crc kubenswrapper[4774]: E0127 00:07:52.357802 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:52 crc kubenswrapper[4774]: E0127 00:07:52.358178 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:52 crc kubenswrapper[4774]: E0127 00:07:52.358445 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:52 crc kubenswrapper[4774]: E0127 00:07:52.358604 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.363069 4774 scope.go:117] "RemoveContainer" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.396025 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2
f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.414295 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.445269 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.450806 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.504954 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.518847 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.534117 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.547347 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.548264 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.564703 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.582481 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.595596 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.609718 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.623610 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.637586 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.651642 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.653128 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.667968 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.681767 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.697024 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.756936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.756995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.757010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.757041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.757055 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.784392 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/1.log" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.786113 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.787361 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.802609 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.823106 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.839375 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.855268 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860471 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.860559 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.877717 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.896315 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.913703 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.928286 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.942922 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 
00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.961207 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.962938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.962972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.962981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.962997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.963010 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:52Z","lastTransitionTime":"2026-01-27T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.972269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:52 crc kubenswrapper[4774]: I0127 00:07:52.985540 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:52Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.003078 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.013397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.013446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc 
kubenswrapper[4774]: I0127 00:07:53.013456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.013474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.013484 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.026428 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc22
6684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.029732 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.033348 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.044555 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.051017 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.054526 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.060317 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.069734 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e
18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.073915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.073971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.073987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.074011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.074033 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.075328 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.088108 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.092297 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.110027 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.110206 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.112744 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216472 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.216508 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.320630 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.328565 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:42:23.361095693 +0000 UTC Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.423697 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.526420 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629801 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.629823 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.732485 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.792653 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/2.log" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.793720 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/1.log" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.798247 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" exitCode=1 Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.798335 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.798445 4774 scope.go:117] "RemoveContainer" containerID="b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.799807 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:07:53 crc kubenswrapper[4774]: E0127 00:07:53.800198 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.828709 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc22
6684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\
"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.834787 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.848617 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.866727 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.880142 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.900051 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 
2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.916392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.933740 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.938848 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:53Z","lastTransitionTime":"2026-01-27T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.953127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.971045 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:53 crc kubenswrapper[4774]: I0127 00:07:53.985173 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:53Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.007541 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.027499 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043384 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.043439 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.044081 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.063237 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.087438 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d780
1e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.100335 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.114540 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.135107 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.135279 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.135406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:54 crc 
kubenswrapper[4774]: E0127 00:07:54.135510 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.135408613 +0000 UTC m=+84.441185507 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.135582 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.135612 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.135718 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.13569313 +0000 UTC m=+84.441470024 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.135771 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.135730941 +0000 UTC m=+84.441508005 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.146227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.237205 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.237347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.237403 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237684 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237811 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237850 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237913 4774 
projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237953 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:10.237903956 +0000 UTC m=+68.543680870 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.237702 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.238008 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.237975328 +0000 UTC m=+84.543752402 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.238102 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.238151 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.238307 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:26.238257955 +0000 UTC m=+84.544035029 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249246 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.249280 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.329356 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 20:54:44.987034761 +0000 UTC Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.352938 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.356220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.356268 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.356287 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.356443 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.356550 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.356756 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.357395 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.357584 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.456917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.456997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.457063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.457705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.457769 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.561537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664597 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.664614 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.767922 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.805327 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/2.log" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.810526 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.812505 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:07:54 crc kubenswrapper[4774]: E0127 00:07:54.812787 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.825131 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.833509 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.857580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.871252 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.876487 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.901625 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.927447 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3c41ceaf68ce40b4208c2dc3188ebc23b6a14b26e48bfb12b369fd743199ad5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:35Z\\\",\\\"message\\\":\\\":map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0127 00:07:35.802798 6188 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.948581 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4
ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.968242 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.974907 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:54Z","lastTransitionTime":"2026-01-27T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:54 crc kubenswrapper[4774]: I0127 00:07:54.987992 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.002098 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.018316 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.040473 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.056896 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.074202 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.077094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.078470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.078515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.078542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.078561 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.091957 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.115319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.127934 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.139560 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.154030 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.166985 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.181296 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.186541 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c
2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.220659 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc22
6684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.238094 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.260403 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.276988 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.285930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.286011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.286027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.286077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.286095 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.291521 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.309306 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.324243 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.329813 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 22:46:03.845419769 +0000 UTC Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.339469 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.355063 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.371319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.388152 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.389996 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.406179 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.420849 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.444788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd00
6d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a
11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.459206 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:55Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.493689 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.597651 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.701306 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804455 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.804516 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909597 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:55 crc kubenswrapper[4774]: I0127 00:07:55.909627 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:55Z","lastTransitionTime":"2026-01-27T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015106 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.015155 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.118312 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.221696 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.324887 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.330483 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 16:45:31.613671557 +0000 UTC Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.356163 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.356272 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.356282 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.356289 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:56 crc kubenswrapper[4774]: E0127 00:07:56.356368 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:56 crc kubenswrapper[4774]: E0127 00:07:56.356531 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:56 crc kubenswrapper[4774]: E0127 00:07:56.356690 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:56 crc kubenswrapper[4774]: E0127 00:07:56.356800 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.427434 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.530742 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.633742 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.708181 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.726124 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.735958 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.744528 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.760336 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.773971 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.787605 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.800716 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.810185 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.823351 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.838817 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.840685 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.855896 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.869348 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.894052 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.905099 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.919753 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.934524 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.941931 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:56Z","lastTransitionTime":"2026-01-27T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.954644 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f
3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.978681 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc22
6684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:56 crc kubenswrapper[4774]: I0127 00:07:56.996062 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:07:56Z is after 2025-08-24T17:21:41Z" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.044928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.045007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.045029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.045053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.045071 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.147830 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251529 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.251550 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.331597 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 08:32:08.645048274 +0000 UTC Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.354795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.354948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.354974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.355012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.355038 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.458713 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.562614 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.669574 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.771850 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.874411 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.977909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.977984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.978004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.978031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:57 crc kubenswrapper[4774]: I0127 00:07:57.978050 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:57Z","lastTransitionTime":"2026-01-27T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.081464 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.184785 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.286966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.287014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.287027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.287047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.287059 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.332647 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 14:43:06.411124686 +0000 UTC Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.356113 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.356157 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.356220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.356122 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:07:58 crc kubenswrapper[4774]: E0127 00:07:58.356287 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:07:58 crc kubenswrapper[4774]: E0127 00:07:58.356434 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:07:58 crc kubenswrapper[4774]: E0127 00:07:58.356522 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:07:58 crc kubenswrapper[4774]: E0127 00:07:58.356513 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.389476 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.491593 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.594147 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.696593 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.798730 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:58 crc kubenswrapper[4774]: I0127 00:07:58.901775 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:58Z","lastTransitionTime":"2026-01-27T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.004630 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.106998 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.209634 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.312633 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.333595 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 21:05:13.868922466 +0000 UTC Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.414919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.415161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.415220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.415282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.415340 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.518566 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.621283 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.723843 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.826426 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.928987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.929376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.929531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.929678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:07:59 crc kubenswrapper[4774]: I0127 00:07:59.929832 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:07:59Z","lastTransitionTime":"2026-01-27T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.033446 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.136299 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.238543 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.334374 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 11:23:33.352653196 +0000 UTC Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.342567 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.355897 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.356012 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:00 crc kubenswrapper[4774]: E0127 00:08:00.356047 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:00 crc kubenswrapper[4774]: E0127 00:08:00.356204 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.356298 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:00 crc kubenswrapper[4774]: E0127 00:08:00.356431 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.356550 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:00 crc kubenswrapper[4774]: E0127 00:08:00.356637 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.445661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.446037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.446127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.446219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.446310 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.549750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.653253 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.756959 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.860540 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:00 crc kubenswrapper[4774]: I0127 00:08:00.963742 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:00Z","lastTransitionTime":"2026-01-27T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.066756 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.169441 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.272955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.273307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.273439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.273558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.273640 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.335179 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 03:05:08.33281967 +0000 UTC Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.377306 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.479964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.480042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.480067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.480097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.480120 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.583843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.583950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.583971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.583999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.584017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.686900 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.687320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.687470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.687631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.687842 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.791281 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.893926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.893980 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.893989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.894010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.894021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:01 crc kubenswrapper[4774]: I0127 00:08:01.996866 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:01Z","lastTransitionTime":"2026-01-27T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.099142 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.202215 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305121 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.305257 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.336510 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 15:47:16.857519182 +0000 UTC Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.355828 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.355905 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.355961 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:02 crc kubenswrapper[4774]: E0127 00:08:02.356080 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.356129 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:02 crc kubenswrapper[4774]: E0127 00:08:02.356286 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:02 crc kubenswrapper[4774]: E0127 00:08:02.356432 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:02 crc kubenswrapper[4774]: E0127 00:08:02.356613 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.380161 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b9497751
4306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.393191 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.406362 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.407779 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.417532 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.433756 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.458004 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.472078 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.484206 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.495741 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.505217 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510286 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.510352 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.517551 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.536221 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.550292 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.562911 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.573312 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.585093 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.597320 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.605946 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.613162 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.716117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.818814 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:02 crc kubenswrapper[4774]: I0127 00:08:02.922205 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:02Z","lastTransitionTime":"2026-01-27T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.024962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.025000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.025008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.025025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.025036 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.127880 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.127958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.127977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.127992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.128023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.231633 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.334966 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.337025 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 17:48:54.960392019 +0000 UTC Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.444224 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.446367 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.470456 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.476429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.476656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.476797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.477033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.477186 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.499150 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.506245 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.528792 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535181 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.535227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.558402 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.563980 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.586481 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:03Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:03 crc kubenswrapper[4774]: E0127 00:08:03.586617 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.589667 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.692946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.692996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.693007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.693024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.693035 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795457 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.795514 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898839 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:03 crc kubenswrapper[4774]: I0127 00:08:03.898928 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:03Z","lastTransitionTime":"2026-01-27T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.001412 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.104424 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.207565 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.311388 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.337459 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 02:16:16.964324352 +0000 UTC Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.355987 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.356106 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.355995 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.355995 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:04 crc kubenswrapper[4774]: E0127 00:08:04.356236 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:04 crc kubenswrapper[4774]: E0127 00:08:04.356386 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:04 crc kubenswrapper[4774]: E0127 00:08:04.356566 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:04 crc kubenswrapper[4774]: E0127 00:08:04.356737 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.413987 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.516710 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.620823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.620942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.620962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.621000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.621021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.725932 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.829620 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:04 crc kubenswrapper[4774]: I0127 00:08:04.934433 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:04Z","lastTransitionTime":"2026-01-27T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.037840 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.141391 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.246888 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.338545 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 21:48:53.290432675 +0000 UTC Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.350150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.357426 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:08:05 crc kubenswrapper[4774]: E0127 00:08:05.357647 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.452914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.452961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.452974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.452990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.453000 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555501 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.555557 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.658644 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.761957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.762055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.762082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.762108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.762129 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.865289 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.968983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.969053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.969072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.969100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:05 crc kubenswrapper[4774]: I0127 00:08:05.969122 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:05Z","lastTransitionTime":"2026-01-27T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.073683 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.177280 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280931 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.280980 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.339189 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 13:56:19.857443462 +0000 UTC Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.356700 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.356713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.356713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.356891 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:06 crc kubenswrapper[4774]: E0127 00:08:06.357134 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:06 crc kubenswrapper[4774]: E0127 00:08:06.357273 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:06 crc kubenswrapper[4774]: E0127 00:08:06.357479 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:06 crc kubenswrapper[4774]: E0127 00:08:06.357669 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.383903 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487305 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.487374 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.590936 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693181 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.693243 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.796200 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.898985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.899057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.899075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.899103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:06 crc kubenswrapper[4774]: I0127 00:08:06.899122 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:06Z","lastTransitionTime":"2026-01-27T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.001680 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104371 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104403 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.104427 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.207638 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.310465 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.339331 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 19:34:28.846530598 +0000 UTC Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.413983 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.517550 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.620998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.621043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.621053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.621070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.621082 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.724285 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.827571 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935686 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:07 crc kubenswrapper[4774]: I0127 00:08:07.935730 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:07Z","lastTransitionTime":"2026-01-27T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.038703 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.140998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.141047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.141064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.141086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.141101 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.244478 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.339932 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 11:24:35.655453313 +0000 UTC Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.347552 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.355962 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.355983 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.356049 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.356110 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:08 crc kubenswrapper[4774]: E0127 00:08:08.356100 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:08 crc kubenswrapper[4774]: E0127 00:08:08.356202 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:08 crc kubenswrapper[4774]: E0127 00:08:08.356321 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:08 crc kubenswrapper[4774]: E0127 00:08:08.356551 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.450875 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.554590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.657949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.657996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.658021 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.658040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.658051 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.761807 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.865421 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.967966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.968008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.968022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.968040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:08 crc kubenswrapper[4774]: I0127 00:08:08.968052 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:08Z","lastTransitionTime":"2026-01-27T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.070995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.071044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.071090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.071111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.071123 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.173914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.173966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.173977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.173993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.174001 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.276962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.276999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.277009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.277025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.277035 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.340550 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:01:47.802081129 +0000 UTC Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.379101 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.481909 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.584374 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.686998 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.789825 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.892573 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:09 crc kubenswrapper[4774]: I0127 00:08:09.995126 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:09Z","lastTransitionTime":"2026-01-27T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.098753 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.201428 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.308960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.309019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.309034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.309058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.309075 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.334954 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.335112 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.335245 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:08:42.335220546 +0000 UTC m=+100.640997430 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.341333 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 05:27:09.31000823 +0000 UTC Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.356943 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.356999 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.357027 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.357149 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.357237 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.357286 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.357317 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:10 crc kubenswrapper[4774]: E0127 00:08:10.357453 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.412527 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.515312 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618246 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.618399 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.725441 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.828927 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932085 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932117 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:10 crc kubenswrapper[4774]: I0127 00:08:10.932129 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:10Z","lastTransitionTime":"2026-01-27T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.035121 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139454 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.139483 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.242530 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.342274 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 04:13:30.260487847 +0000 UTC Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.345205 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.449641 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.552933 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.656300 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759380 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.759410 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.862401 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.876819 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/0.log" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.876899 4774 generic.go:334] "Generic (PLEG): container finished" podID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" containerID="62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b" exitCode=1 Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.876938 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerDied","Data":"62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.877393 4774 scope.go:117] "RemoveContainer" containerID="62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.895737 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.914936 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.932097 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.960163 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965370 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.965393 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:11Z","lastTransitionTime":"2026-01-27T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.977746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:11 crc kubenswrapper[4774]: I0127 00:08:11.992818 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:11Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.009440 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.021466 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.032666 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.042153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.054943 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.068091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.068124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.068133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.068147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 
00:08:12.068157 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.069899 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.081285 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 
00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.094110 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.107970 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.119127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.145230 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.158382 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.170678 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.273953 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.343911 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 02:37:16.977411306 +0000 UTC Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.356358 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.356373 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:12 crc kubenswrapper[4774]: E0127 00:08:12.356496 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.356549 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.356585 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:12 crc kubenswrapper[4774]: E0127 00:08:12.356679 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:12 crc kubenswrapper[4774]: E0127 00:08:12.356698 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:12 crc kubenswrapper[4774]: E0127 00:08:12.356742 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.368287 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.377340 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.378208 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.394434 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-2
7T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2
f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.407008 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.420788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.431271 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.449657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.470735 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.480374 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.483055 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.494469 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.505577 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.516338 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.528140 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.540121 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.552236 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.562128 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.573989 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:
36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.585628 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.587000 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.687743 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.791186 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.883758 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/0.log" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.883847 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.895978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.920466 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.929973 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.940448 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.950685 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.963795 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.983285 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.996459 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:12 crc kubenswrapper[4774]: I0127 00:08:12.998367 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:12Z","lastTransitionTime":"2026-01-27T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.008966 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.021201 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.041211 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.076491 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.089647 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.100797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.101839 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.112838 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.122121 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.133269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.144819 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.154685 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.250265 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.344245 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 04:26:40.392832452 +0000 UTC Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.353787 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457298 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.457316 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.561294 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.660991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.679461 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.685507 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.707201 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.711520 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.726164 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.730928 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.743805 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.749279 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.762848 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:13 crc kubenswrapper[4774]: E0127 00:08:13.763017 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.764896 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868380 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.868418 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:13 crc kubenswrapper[4774]: I0127 00:08:13.971424 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:13Z","lastTransitionTime":"2026-01-27T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.074646 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.177830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.177901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.177913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.177929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.178052 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.281797 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.345049 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 00:39:39.921486423 +0000 UTC Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.357102 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.357220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:14 crc kubenswrapper[4774]: E0127 00:08:14.357307 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.357119 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:14 crc kubenswrapper[4774]: E0127 00:08:14.357399 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:14 crc kubenswrapper[4774]: E0127 00:08:14.357443 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.357102 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:14 crc kubenswrapper[4774]: E0127 00:08:14.357522 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.384972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.385023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.385036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.385066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.385084 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.488493 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.590601 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.693515 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.795987 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:14 crc kubenswrapper[4774]: I0127 00:08:14.898268 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:14Z","lastTransitionTime":"2026-01-27T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000892 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000932 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.000955 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.103235 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.205931 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.308941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.308988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.308997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.309012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.309023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.345283 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:50:15.169470141 +0000 UTC Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413308 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.413320 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516278 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.516313 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.619259 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.722603 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.825592 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:15 crc kubenswrapper[4774]: I0127 00:08:15.927978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:15Z","lastTransitionTime":"2026-01-27T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.030325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.133149 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.236203 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.338407 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.345360 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 18:10:12.204833182 +0000 UTC Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.355907 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.355969 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.355971 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:16 crc kubenswrapper[4774]: E0127 00:08:16.356058 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.356108 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:16 crc kubenswrapper[4774]: E0127 00:08:16.356210 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:16 crc kubenswrapper[4774]: E0127 00:08:16.356343 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:16 crc kubenswrapper[4774]: E0127 00:08:16.356457 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.441200 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.544429 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.655894 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.758375 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.859988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.860045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.860057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.860071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.860083 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:16 crc kubenswrapper[4774]: I0127 00:08:16.962674 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:16Z","lastTransitionTime":"2026-01-27T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.065390 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.167631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.168076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.168157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.168275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.168357 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.271794 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.346468 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 09:50:17.486341569 +0000 UTC Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.374826 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.477695 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.580630 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.683973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.684046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.684063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.684082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.684095 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.786913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.786987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.787009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.787039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.787062 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.890800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.891277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.891432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.891564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.891750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.994417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.994916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.995209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.995416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:17 crc kubenswrapper[4774]: I0127 00:08:17.995610 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:17Z","lastTransitionTime":"2026-01-27T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.098733 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.202946 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306548 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.306590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.346915 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 05:54:32.336339351 +0000 UTC Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.356366 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.356390 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.356470 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:18 crc kubenswrapper[4774]: E0127 00:08:18.356803 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.356844 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:18 crc kubenswrapper[4774]: E0127 00:08:18.357030 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:18 crc kubenswrapper[4774]: E0127 00:08:18.357202 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:18 crc kubenswrapper[4774]: E0127 00:08:18.357327 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.358607 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.409265 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513452 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.513503 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.615789 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718371 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.718400 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.820979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.821018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.821031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.821047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.821058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.906185 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/2.log" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.913436 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.913948 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.922853 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:18Z","lastTransitionTime":"2026-01-27T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.927802 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.939610 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.948712 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.959896 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 
00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.978317 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:18 crc kubenswrapper[4774]: I0127 00:08:18.990494 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:18Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.003458 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.017101 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.025796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.025830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc 
kubenswrapper[4774]: I0127 00:08:19.025842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.025876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.025890 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.035379 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737
125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, 
Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.047676 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.061670 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.075826 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.086465 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.097977 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.108280 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.119188 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.127970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.128014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.128028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.128046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 
00:08:19.128060 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.129762 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.139288 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.231199 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334106 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.334140 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.347445 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 20:11:07.194924017 +0000 UTC Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.436881 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.539297 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.641394 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744460 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.744478 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846459 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.846563 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.917235 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/3.log" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.917764 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/2.log" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.920070 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" exitCode=1 Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.920111 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.920163 4774 scope.go:117] "RemoveContainer" containerID="5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.920964 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:08:19 crc kubenswrapper[4774]: E0127 00:08:19.921218 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.941090 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.948686 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:19Z","lastTransitionTime":"2026-01-27T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.956830 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.966603 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.983730 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:19 crc kubenswrapper[4774]: I0127 00:08:19.998742 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:19Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.013307 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.030154 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.052364 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f4a4151bc8222a677603cb8394f5d16571afc226684101fa8d090dec9d8a1ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:07:53Z\\\",\\\"message\\\":\\\"ervices_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293219 6404 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nI0127 00:07:53.293221 6404 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0127 00:07:53.292667 6404 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Gro\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:19Z\\\",\\\"message\\\":\\\" 6745 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154432 6745 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154709 6745 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154794 6745 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.155077 6745 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:08:19.155178 6745 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 00:08:19.155195 6745 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 00:08:19.155224 6745 factory.go:656] Stopping watch factory\\\\nI0127 00:08:19.155243 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 00:08:19.188935 6745 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0127 00:08:19.188953 6745 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0127 00:08:19.189017 6745 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:08:19.189036 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0127 00:08:19.189166 6745 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89
dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.054499 
4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.066925 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.077792 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.088930 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.101999 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.112489 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.125269 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.138644 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.151672 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156740 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.156765 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.162644 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.173561 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.259207 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.347953 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 05:07:11.48996847 +0000 UTC Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.356436 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.356487 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.356515 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.356447 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.356655 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.356754 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.356847 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.356937 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.361550 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.463663 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.565837 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.668802 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.771297 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.874201 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.924367 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/3.log" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.929202 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:08:20 crc kubenswrapper[4774]: E0127 00:08:20.929834 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.940528 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.956419 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.964524 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.973574 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 
00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:20 crc kubenswrapper[4774]: I0127 00:08:20.976635 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:20Z","lastTransitionTime":"2026-01-27T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.000122 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:20Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.008959 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.016982 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.035329 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737
125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:19Z\\\",\\\"message\\\":\\\" 6745 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154432 6745 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154709 6745 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154794 6745 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.155077 6745 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:08:19.155178 6745 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 00:08:19.155195 6745 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 00:08:19.155224 6745 factory.go:656] Stopping watch factory\\\\nI0127 00:08:19.155243 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 00:08:19.188935 6745 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0127 00:08:19.188953 6745 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0127 00:08:19.189017 6745 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:08:19.189036 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0127 00:08:19.189166 6745 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.049371 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.063839 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.075755 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.078994 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.091156 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.106023 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.119153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.132772 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.145098 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.158613 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.170092 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:21Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184041 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.184159 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.287822 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.348594 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 15:23:02.115257284 +0000 UTC Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390138 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.390192 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.492425 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.595405 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.698421 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.801964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.802015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.802024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.802041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.802054 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:21 crc kubenswrapper[4774]: I0127 00:08:21.905116 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:21Z","lastTransitionTime":"2026-01-27T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.007771 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.110362 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.214433 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.317418 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.349687 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:23:24.4486159 +0000 UTC Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.356145 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.356210 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.356262 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:22 crc kubenswrapper[4774]: E0127 00:08:22.356381 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.356453 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:22 crc kubenswrapper[4774]: E0127 00:08:22.356587 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:22 crc kubenswrapper[4774]: E0127 00:08:22.356717 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:22 crc kubenswrapper[4774]: E0127 00:08:22.356780 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.371354 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.387618 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.402605 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.421537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.422412 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.440250 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.460258 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.475393 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.493621 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.505853 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.518889 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 
00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523911 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.523952 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.542657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.554937 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.566616 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.584985 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.607346 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:19Z\\\",\\\"message\\\":\\\" 6745 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154432 6745 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154709 6745 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154794 6745 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.155077 6745 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:08:19.155178 6745 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 00:08:19.155195 6745 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 00:08:19.155224 6745 factory.go:656] Stopping watch factory\\\\nI0127 00:08:19.155243 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 00:08:19.188935 6745 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0127 00:08:19.188953 6745 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0127 00:08:19.189017 6745 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:08:19.189036 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0127 00:08:19.189166 6745 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.624081 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.626265 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.644760 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.659602 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.729680 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.832990 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:22 crc kubenswrapper[4774]: I0127 00:08:22.935193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:22Z","lastTransitionTime":"2026-01-27T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.037940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.037999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.038019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.038039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.038051 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.141962 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.244915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.244984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.244997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.245015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.245048 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.347472 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.350572 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 21:22:12.743859777 +0000 UTC Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.449688 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.552137 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.654943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.654994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.655024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.655039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.655049 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.758716 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.861796 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:23 crc kubenswrapper[4774]: I0127 00:08:23.964483 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:23Z","lastTransitionTime":"2026-01-27T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.059763 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.070958 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.074190 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.084351 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.087664 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.098472 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.101487 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.112029 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.115232 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.126776 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.126955 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.128458 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.230651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.230919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.230933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.231038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.231054 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.333770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.333921 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.333943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.334482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.334539 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.351215 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 01:43:33.034718463 +0000 UTC Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.356782 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.356839 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.356810 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.357040 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.357028 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.357158 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.357259 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:24 crc kubenswrapper[4774]: E0127 00:08:24.357394 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.437373 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540451 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.540487 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.643555 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.746661 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.849347 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.952927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.952997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.953014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.953038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:24 crc kubenswrapper[4774]: I0127 00:08:24.953059 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:24Z","lastTransitionTime":"2026-01-27T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.056470 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159356 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.159373 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.262440 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.351444 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 06:57:05.409691664 +0000 UTC Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.365566 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.469156 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573207 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.573247 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.676597 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.778580 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.881667 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983801 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:25 crc kubenswrapper[4774]: I0127 00:08:25.983821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:25Z","lastTransitionTime":"2026-01-27T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.086537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.189910 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.205845 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206051 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.2060179 +0000 UTC m=+148.511794824 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.206118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206258 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.206267 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206327 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.206309617 +0000 UTC m=+148.512086521 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206451 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.206600 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.206583355 +0000 UTC m=+148.512360289 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292458 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.292578 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.307145 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.307187 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307335 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307359 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307371 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307420 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-27 00:09:30.307402991 +0000 UTC m=+148.613179885 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307552 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307618 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307638 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.307713 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.307693158 +0000 UTC m=+148.613470082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.352183 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 17:21:55.672530673 +0000 UTC Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.356618 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.356700 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.356800 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.356849 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.356896 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.356978 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.357108 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:26 crc kubenswrapper[4774]: E0127 00:08:26.357357 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.395338 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.498381 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.601978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.704806 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.807845 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:26 crc kubenswrapper[4774]: I0127 00:08:26.913929 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:26Z","lastTransitionTime":"2026-01-27T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.016804 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.119809 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.223818 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.328633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.329131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.329157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.329189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.329213 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.353208 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 08:21:34.205648199 +0000 UTC Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.432979 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534811 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.534880 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637900 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637931 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.637942 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.741138 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.844213 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:27 crc kubenswrapper[4774]: I0127 00:08:27.946936 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:27Z","lastTransitionTime":"2026-01-27T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.049197 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.152268 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.254994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.255032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.255041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.255056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.255066 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.354381 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 09:20:08.750380122 +0000 UTC Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.355786 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.355786 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.355853 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.355917 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:28 crc kubenswrapper[4774]: E0127 00:08:28.355967 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:28 crc kubenswrapper[4774]: E0127 00:08:28.356045 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:28 crc kubenswrapper[4774]: E0127 00:08:28.356131 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:28 crc kubenswrapper[4774]: E0127 00:08:28.356241 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.357680 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459742 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.459805 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.562354 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.665967 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768468 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.768572 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871736 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.871785 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:28 crc kubenswrapper[4774]: I0127 00:08:28.975555 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:28Z","lastTransitionTime":"2026-01-27T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.078166 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.181361 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.283991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.354649 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:55:06.289352394 +0000 UTC Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.387256 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.490437 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.593602 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.698410 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.801917 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:29 crc kubenswrapper[4774]: I0127 00:08:29.904733 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:29Z","lastTransitionTime":"2026-01-27T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.007406 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110450 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.110470 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.214340 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.317094 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355478 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 18:01:58.924087451 +0000 UTC Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355632 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355688 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355693 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.355632 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:30 crc kubenswrapper[4774]: E0127 00:08:30.355796 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:30 crc kubenswrapper[4774]: E0127 00:08:30.355978 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:30 crc kubenswrapper[4774]: E0127 00:08:30.356061 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:30 crc kubenswrapper[4774]: E0127 00:08:30.356125 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.420823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.420991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.421022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.421077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.421105 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.524385 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.627729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.730894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.730973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.730986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.731007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.731020 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.834521 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:30 crc kubenswrapper[4774]: I0127 00:08:30.938219 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:30Z","lastTransitionTime":"2026-01-27T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.041496 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.144908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.144999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.145009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.145035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.145051 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.248956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.249031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.249049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.249074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.249092 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.352314 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.356508 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 23:03:21.817513257 +0000 UTC Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.372838 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.454719 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.557811 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.662837 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.767461 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.871763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.871896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.871929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.872007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.872033 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:31 crc kubenswrapper[4774]: I0127 00:08:31.974916 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:31Z","lastTransitionTime":"2026-01-27T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.078855 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.182389 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.285329 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356136 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356213 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356162 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356205 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:32 crc kubenswrapper[4774]: E0127 00:08:32.356325 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:32 crc kubenswrapper[4774]: E0127 00:08:32.356525 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:32 crc kubenswrapper[4774]: E0127 00:08:32.356620 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:32 crc kubenswrapper[4774]: E0127 00:08:32.356674 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.356736 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 11:41:23.282697993 +0000 UTC Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.373124 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df2835acc23f4db80733c12c0a5baa5cd473c5acdf8f65d3c8e5225a8c31ae9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75fae1fc321d6fb9fff05b945d899bd3d208235b5f4eec657da29eb32df987c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.385672 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-g4cnl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c687c86-483c-433c-8a0c-a232c5d48974\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a0add0aaab48c613e172f3cdbad238d18a6bd8071eebed4b785dc8fd7964508\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8f7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:26Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-g4cnl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388275 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.388328 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.400902 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"936f5a31-13bf-456e-83d8-1aee977d2af5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3307ad32cef40246495580f9e0e310d6fbb2f2937353f7463d329530719f41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb171228ce5e3d97f70fc6fcd8bf07af35e8a942b46eb37e02fea914b9235a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnp67\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wbzx6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.421049 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a62b292b-334d-470c-97c4-b86640ebc5bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"file observer\\\\nW0127 00:07:22.161051 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 00:07:22.161212 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 00:07:22.161840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-379538748/tls.crt::/tmp/serving-cert-379538748/tls.key\\\\\\\"\\\\nI0127 00:07:22.727160 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 00:07:22.730051 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 00:07:22.730071 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 00:07:22.730093 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 00:07:22.730098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 00:07:22.737506 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 00:07:22.737541 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737545 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 00:07:22.737549 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 00:07:22.737552 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 00:07:22.737555 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 00:07:22.737558 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 00:07:22.737817 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 00:07:22.738789 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.440199 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8mtkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"052efff3-b53a-4586-8ccf-f8e9b1a47174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f1f244833ecb115bea42c24b60687ac604cae35d4bbea489b25c1aab6e1396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jb889\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8mtkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.453687 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6djzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xtfjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6djzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.480790 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"901f0d66-b649-40cf-99d0-f161e63e2864\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d15df89a89ea450e5354f2330854ac8e3f4c6bd336d3f7336455a76c99363e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768c75389653697aeb73072ca4df08ef7cde1d41d98d6916c9d11c16c28f3202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30a81309dbe2ecdd125a280c3ab9a7a7d3425887d24c96838573290b534a3883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2085c1cd006d4a9c3fad9dd56b713b21efa9e9
574e74d4c21b3eb6ed41affcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee74791f22f2079c3c499648b3415dfbce4efee4f08fda5feda41848544bc9fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2f710cc602a2e3695b6d9b0bc1039dbeecccd55ea0ad3d7571cc977b828283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539c46403b94977514306f0d7801e84bcb4be5691d2f5c88ec6bfcee2720ddc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ddb9eb9e0ecba4c40c764b93471413640fc997a11bc97b7477be060c8bd1b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.491218 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.504712 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f562fe596d54ab79e4a5325b8dfde89dc68a7ebdf2bb6ce07121592304f9e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.521178 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.540437 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://929230765c7ac577ff90632d8180e5d5e7fb2018359caeb9758adfef79b86590\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.557654 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-57k5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c578fd22-1672-431a-9914-66f55a0260bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acaba411b3b2a11bccffd288d5c4f1b47266667d73b2e73ea0ed17fe4768f341\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff79994ffdcb03d27f85c8ae23d0c2762fe781ee502ee4c97b42090a59e57c5a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab269b21041671bc47aaa25b844fcced8d648b0c707ed4d93da48fd470ffe52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b6083b17afebb6a009bed7ed15175a59d38ebabf7c97fbef79b2fb8bf196f74\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06679ccb8b78e4912e6f9043576ab223e3f3619a25a7ec067f9de72d33bb7d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80913307e1707bd454a40f59efa0c728a857e34876b7d4ef85e84c4f30a9cfc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ccf21f9eb1a570e686976f00b5ff9bc0420d3b3bbcba1148c5ac964a8dc842a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7b9bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-57k5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 
2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.579102 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db881c9d-a960-48ae-93bf-d0ccd687e0b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3
3ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:19Z\\\",\\\"message\\\":\\\" 6745 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154432 6745 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154709 6745 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.154794 6745 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 00:08:19.155077 6745 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 00:08:19.155178 6745 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 00:08:19.155195 6745 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 00:08:19.155224 6745 factory.go:656] Stopping watch factory\\\\nI0127 00:08:19.155243 6745 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 00:08:19.188935 6745 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0127 00:08:19.188953 6745 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0127 00:08:19.189017 6745 ovnkube.go:599] Stopped ovnkube\\\\nI0127 00:08:19.189036 6745 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0127 00:08:19.189166 6745 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:08:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l92km\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-l5rgv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.593370 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8953bcc-27ee-4be1-a990-9f559ea6bbc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d0e4428feac8787d889cfa23715983ac112bac861e3e00ab7ae19ec676964eb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1aaa7b231202686b65f633283644052e65287917c7621e
708f036c1a7278863b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1aaa7b231202686b65f633283644052e65287917c7621e708f036c1a7278863b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.594991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.607397 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95f84220-fb9c-4fec-bf69-715c647dff1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ff4e8b7267d66b736878f3f59bbd786cfd8b36b9b668cabb313caaeb0d92c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf2b4b44ebcc4b98c359beb599d620cad5874ce6965570921c07d0fe861f472\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f41e37bb28d10e4be9ce683e5ad5867b149d17c0d55f0d51e700482f12c6d7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.619722 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.631071 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.641904 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e1e5f86c009b4ca55c30ca73705443acfcc5405b98c125d449751cb556836b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dd9gc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2nl9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.665805 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mtz9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0abcf78e-9b05-4b89-94f3-4d3230886ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T00:08:11Z\\\",\\\"message\\\":\\\"2026-01-27T00:07:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7\\\\n2026-01-27T00:07:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d3710b68-47b2-4a51-9f54-4151ab15eea7 to /host/opt/cni/bin/\\\\n2026-01-27T00:07:26Z [verbose] multus-daemon started\\\\n2026-01-27T00:07:26Z [verbose] Readiness Indicator file check\\\\n2026-01-27T00:08:11Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmt7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mtz9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.679702 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c54e149-0291-4a04-b8ad-878d1b98529d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e72846faa6ea8f25cea14b6f23a073abff861b4ab77eb1c7ebde8203e2085fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://025411801b461e1bcd772cdaa30ddf9f31486cd9c78952ea110a46a969cef380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b91eb1e10156290e32cf22e65af22569278073a34dbaa4e728ba517486ab069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90f5ad33b77393e80aacf609b6b5a28950fc11628246773bcc1d5587c467f537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T00:07:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T00:07:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T00:07:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.698681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.698735 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.698748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.698769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.699118 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.801997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.802032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.802042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.802055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.802064 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:32 crc kubenswrapper[4774]: I0127 00:08:32.905170 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:32Z","lastTransitionTime":"2026-01-27T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.009155 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.113150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217159 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.217181 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.320375 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.358001 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 06:30:47.952168566 +0000 UTC Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.424114 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.526402 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.629513 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.732549 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.835225 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938326 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938457 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:33 crc kubenswrapper[4774]: I0127 00:08:33.938481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:33Z","lastTransitionTime":"2026-01-27T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.040909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.040967 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.040980 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.041002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.041016 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.254682 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.264411 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.277826 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280939 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.280982 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.293389 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.299984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.300044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.300062 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.300099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.300115 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.314193 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.317478 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.328763 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.333354 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.346500 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T00:08:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"798433ee-0aed-45e3-8b2f-39b7bf5cbb06\\\",\\\"systemUUID\\\":\\\"e18b2370-db20-4c66-88f9-fff4652ef035\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T00:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.346729 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.356020 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.356062 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.356093 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.356036 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.356180 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.356283 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.356429 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:34 crc kubenswrapper[4774]: E0127 00:08:34.356548 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.357884 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.358140 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:48:53.728382865 +0000 UTC Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460387 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460456 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.460501 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.563805 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.667257 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.770813 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.874530 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:34 crc kubenswrapper[4774]: I0127 00:08:34.978248 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:34Z","lastTransitionTime":"2026-01-27T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.081431 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.185368 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.289938 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.357679 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:08:35 crc kubenswrapper[4774]: E0127 00:08:35.357975 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.358263 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 05:49:55.50460727 +0000 UTC Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.393334 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.496977 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.600593 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.704690 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.807793 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.911695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.911819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.911913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.911954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:35 crc kubenswrapper[4774]: I0127 00:08:35.912122 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:35Z","lastTransitionTime":"2026-01-27T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.015351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.015776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.016003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.016171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.016364 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.120555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.121365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.121527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.121673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.121808 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225644 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.225705 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.329230 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.355923 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.355960 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.356011 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:36 crc kubenswrapper[4774]: E0127 00:08:36.356140 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:36 crc kubenswrapper[4774]: E0127 00:08:36.356223 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:36 crc kubenswrapper[4774]: E0127 00:08:36.356379 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.355957 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:36 crc kubenswrapper[4774]: E0127 00:08:36.356520 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.358629 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 04:31:05.696620587 +0000 UTC Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.431962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.432558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.432759 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.433004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.433240 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.537372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.537753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.537830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.537975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.538050 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.642271 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.746390 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.849988 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953464 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:36 crc kubenswrapper[4774]: I0127 00:08:36.953896 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:36Z","lastTransitionTime":"2026-01-27T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.058136 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162708 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.162939 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.266422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.266786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.267098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.267264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.267419 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.359118 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 21:29:52.765344455 +0000 UTC Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.371237 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.473537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.576431 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.680246 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.783604 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886276 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.886384 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:37 crc kubenswrapper[4774]: I0127 00:08:37.989118 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:37Z","lastTransitionTime":"2026-01-27T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.091710 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.194305 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296644 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.296758 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.356247 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.356318 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.356442 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:38 crc kubenswrapper[4774]: E0127 00:08:38.356443 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:38 crc kubenswrapper[4774]: E0127 00:08:38.356594 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:38 crc kubenswrapper[4774]: E0127 00:08:38.356710 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.356827 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:38 crc kubenswrapper[4774]: E0127 00:08:38.356973 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.359206 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 17:02:54.561244176 +0000 UTC Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.398846 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500735 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.500821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603789 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.603818 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.707239 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.809991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913932 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:38 crc kubenswrapper[4774]: I0127 00:08:38.913998 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:38Z","lastTransitionTime":"2026-01-27T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.017101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.017568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.017716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.017965 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.018120 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.120741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.120971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.120988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.121009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.121025 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224453 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.224472 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.327408 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.359329 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 01:38:46.831525133 +0000 UTC Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.429906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.429966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.429981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.430003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.430019 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.532987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.533064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.533088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.533125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.533151 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.637489 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.740776 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843742 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.843757 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:39 crc kubenswrapper[4774]: I0127 00:08:39.946884 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:39Z","lastTransitionTime":"2026-01-27T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.049284 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.151751 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.254746 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.355643 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.355647 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.355701 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.355727 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:40 crc kubenswrapper[4774]: E0127 00:08:40.356410 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:40 crc kubenswrapper[4774]: E0127 00:08:40.356547 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:40 crc kubenswrapper[4774]: E0127 00:08:40.356603 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:40 crc kubenswrapper[4774]: E0127 00:08:40.356623 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.358083 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.359445 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 22:24:39.667592291 +0000 UTC Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462150 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.462283 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.565358 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.668215 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.770994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.771980 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.772025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.772061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.772098 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.875163 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978467 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:40 crc kubenswrapper[4774]: I0127 00:08:40.978486 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:40Z","lastTransitionTime":"2026-01-27T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.081908 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.184691 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288106 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.288186 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.360394 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:02:39.85864256 +0000 UTC Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.390645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.390944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.390976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.391003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.391020 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.493991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.494299 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.494422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.494534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.494674 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.598522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.598634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.598905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.598981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.599013 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701939 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.701996 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.805830 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.908918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.908988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.909006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.909029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:41 crc kubenswrapper[4774]: I0127 00:08:41.909047 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:41Z","lastTransitionTime":"2026-01-27T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.012957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.013016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.013038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.013053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.013063 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.116259 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.219288 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.323231 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.343047 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.343201 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.343253 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs podName:e639e1da-0d65-4d42-b1fc-23d5db91e9e6 nodeName:}" failed. No retries permitted until 2026-01-27 00:09:46.343239025 +0000 UTC m=+164.649015909 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs") pod "network-metrics-daemon-6djzf" (UID: "e639e1da-0d65-4d42-b1fc-23d5db91e9e6") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.356480 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.356611 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.356688 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.356889 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.356952 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.357175 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.357513 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:42 crc kubenswrapper[4774]: E0127 00:08:42.357614 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.360601 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 03:51:18.728686287 +0000 UTC Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.402066 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.402038951 podStartE2EDuration="48.402038951s" podCreationTimestamp="2026-01-27 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.40202761 +0000 UTC m=+100.707804584" watchObservedRunningTime="2026-01-27 00:08:42.402038951 +0000 UTC m=+100.707815865" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426589 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.426601 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.430261 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=74.430226652 podStartE2EDuration="1m14.430226652s" podCreationTimestamp="2026-01-27 00:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.429290667 +0000 UTC m=+100.735067561" watchObservedRunningTime="2026-01-27 00:08:42.430226652 +0000 UTC m=+100.736003576" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.518194 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podStartSLOduration=80.518162113 podStartE2EDuration="1m20.518162113s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.494024136 +0000 UTC m=+100.799801030" watchObservedRunningTime="2026-01-27 00:08:42.518162113 +0000 UTC m=+100.823939037" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.519284 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mtz9l" podStartSLOduration=80.519271091 podStartE2EDuration="1m20.519271091s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.517600408 +0000 UTC m=+100.823377382" watchObservedRunningTime="2026-01-27 00:08:42.519271091 +0000 UTC m=+100.825048015" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.529669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.530015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.530202 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.530355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.530492 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.544171 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.544134757 podStartE2EDuration="1m19.544134757s" podCreationTimestamp="2026-01-27 00:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.543951932 +0000 UTC m=+100.849728826" watchObservedRunningTime="2026-01-27 00:08:42.544134757 +0000 UTC m=+100.849911701" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.598177 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g4cnl" podStartSLOduration=80.598147519 podStartE2EDuration="1m20.598147519s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.578809687 +0000 UTC m=+100.884586611" watchObservedRunningTime="2026-01-27 00:08:42.598147519 +0000 UTC m=+100.903924443" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.633680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.634075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.634215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.634350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.634485 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.639163 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wbzx6" podStartSLOduration=80.639143062 podStartE2EDuration="1m20.639143062s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.598679662 +0000 UTC m=+100.904456556" watchObservedRunningTime="2026-01-27 00:08:42.639143062 +0000 UTC m=+100.944920006" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.645527 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.645509428 podStartE2EDuration="1m20.645509428s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.645281841 +0000 UTC m=+100.951058755" watchObservedRunningTime="2026-01-27 00:08:42.645509428 +0000 UTC m=+100.951286372" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.660416 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8mtkj" podStartSLOduration=80.66039056299999 podStartE2EDuration="1m20.660390563s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.659900151 +0000 UTC m=+100.965677035" watchObservedRunningTime="2026-01-27 00:08:42.660390563 +0000 UTC m=+100.966167467" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.706933 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=11.70691606 podStartE2EDuration="11.70691606s" podCreationTimestamp="2026-01-27 00:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.685899475 +0000 UTC m=+100.991676379" watchObservedRunningTime="2026-01-27 00:08:42.70691606 +0000 UTC m=+101.012692944" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.737200 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.772920 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-57k5g" podStartSLOduration=80.772896753 podStartE2EDuration="1m20.772896753s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:42.747202686 +0000 UTC m=+101.052979590" watchObservedRunningTime="2026-01-27 00:08:42.772896753 +0000 UTC m=+101.078673647" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839466 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.839551 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:42 crc kubenswrapper[4774]: I0127 00:08:42.959247 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:42Z","lastTransitionTime":"2026-01-27T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.061984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.062022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.062031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.062046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.062058 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164708 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.164766 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.267547 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.361223 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 03:34:09.736157344 +0000 UTC Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.370702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.473799 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.576915 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680449 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.680582 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.783576 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.888698 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:43 crc kubenswrapper[4774]: I0127 00:08:43.992366 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:43Z","lastTransitionTime":"2026-01-27T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.095547 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.198909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.198969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.198994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.199013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.199025 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.303254 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.355755 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.355803 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.355913 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.356147 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:44 crc kubenswrapper[4774]: E0127 00:08:44.356309 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:44 crc kubenswrapper[4774]: E0127 00:08:44.356507 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:44 crc kubenswrapper[4774]: E0127 00:08:44.356643 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:44 crc kubenswrapper[4774]: E0127 00:08:44.356823 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.362098 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 13:21:33.326757066 +0000 UTC Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.405569 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.460666 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T00:08:44Z","lastTransitionTime":"2026-01-27T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.535591 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5"] Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.536201 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.540509 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.541016 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.541055 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.541364 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577408 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e07c9c1-5915-4d23-808f-ea9f32dd336b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e07c9c1-5915-4d23-808f-ea9f32dd336b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577569 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577607 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e07c9c1-5915-4d23-808f-ea9f32dd336b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.577707 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.678843 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e07c9c1-5915-4d23-808f-ea9f32dd336b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.678995 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e07c9c1-5915-4d23-808f-ea9f32dd336b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679034 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e07c9c1-5915-4d23-808f-ea9f32dd336b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679078 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679152 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679266 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.679305 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e07c9c1-5915-4d23-808f-ea9f32dd336b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.680985 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e07c9c1-5915-4d23-808f-ea9f32dd336b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.688664 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e07c9c1-5915-4d23-808f-ea9f32dd336b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.710300 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e07c9c1-5915-4d23-808f-ea9f32dd336b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m48n5\" (UID: \"8e07c9c1-5915-4d23-808f-ea9f32dd336b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:44 crc kubenswrapper[4774]: I0127 00:08:44.861931 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" Jan 27 00:08:45 crc kubenswrapper[4774]: I0127 00:08:45.015746 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" event={"ID":"8e07c9c1-5915-4d23-808f-ea9f32dd336b","Type":"ContainerStarted","Data":"b3bb617afafc5f78c5e03844200edf310dba21eb0905965334eb25f5650c05b2"} Jan 27 00:08:45 crc kubenswrapper[4774]: I0127 00:08:45.363080 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 01:54:29.890262752 +0000 UTC Jan 27 00:08:45 crc kubenswrapper[4774]: I0127 00:08:45.363162 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 27 00:08:45 crc kubenswrapper[4774]: I0127 00:08:45.373682 4774 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.022441 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" event={"ID":"8e07c9c1-5915-4d23-808f-ea9f32dd336b","Type":"ContainerStarted","Data":"e95407216eda41a7ac427282f895fd215892e27a7669a58b3031682c532b4032"} Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.043433 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m48n5" podStartSLOduration=84.043406519 podStartE2EDuration="1m24.043406519s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:08:46.043176574 +0000 UTC m=+104.348953548" watchObservedRunningTime="2026-01-27 00:08:46.043406519 +0000 UTC m=+104.349183413" Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.356562 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.356677 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:46 crc kubenswrapper[4774]: E0127 00:08:46.356754 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.356571 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:46 crc kubenswrapper[4774]: I0127 00:08:46.356568 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:46 crc kubenswrapper[4774]: E0127 00:08:46.357067 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:46 crc kubenswrapper[4774]: E0127 00:08:46.357124 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:46 crc kubenswrapper[4774]: E0127 00:08:46.357249 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.356513 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.356719 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.356790 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.357085 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.357305 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.357701 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:48 crc kubenswrapper[4774]: I0127 00:08:48.357783 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.357970 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.357703 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:48 crc kubenswrapper[4774]: E0127 00:08:48.358006 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-l5rgv_openshift-ovn-kubernetes(db881c9d-a960-48ae-93bf-d0ccd687e0b9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" Jan 27 00:08:50 crc kubenswrapper[4774]: I0127 00:08:50.356121 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:50 crc kubenswrapper[4774]: E0127 00:08:50.356289 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:50 crc kubenswrapper[4774]: I0127 00:08:50.356496 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:50 crc kubenswrapper[4774]: E0127 00:08:50.356537 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:50 crc kubenswrapper[4774]: I0127 00:08:50.356651 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:50 crc kubenswrapper[4774]: E0127 00:08:50.356696 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:50 crc kubenswrapper[4774]: I0127 00:08:50.356801 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:50 crc kubenswrapper[4774]: E0127 00:08:50.356842 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:52 crc kubenswrapper[4774]: I0127 00:08:52.355915 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:52 crc kubenswrapper[4774]: I0127 00:08:52.355939 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:52 crc kubenswrapper[4774]: I0127 00:08:52.355939 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:52 crc kubenswrapper[4774]: I0127 00:08:52.357705 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:52 crc kubenswrapper[4774]: E0127 00:08:52.357691 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:52 crc kubenswrapper[4774]: E0127 00:08:52.357847 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:52 crc kubenswrapper[4774]: E0127 00:08:52.357891 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:52 crc kubenswrapper[4774]: E0127 00:08:52.358066 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:54 crc kubenswrapper[4774]: I0127 00:08:54.356743 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:54 crc kubenswrapper[4774]: I0127 00:08:54.356740 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:54 crc kubenswrapper[4774]: I0127 00:08:54.356761 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:54 crc kubenswrapper[4774]: I0127 00:08:54.357056 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:54 crc kubenswrapper[4774]: E0127 00:08:54.357070 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:54 crc kubenswrapper[4774]: E0127 00:08:54.357290 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:54 crc kubenswrapper[4774]: E0127 00:08:54.357502 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:54 crc kubenswrapper[4774]: E0127 00:08:54.357595 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:56 crc kubenswrapper[4774]: I0127 00:08:56.355669 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:56 crc kubenswrapper[4774]: I0127 00:08:56.355724 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:56 crc kubenswrapper[4774]: I0127 00:08:56.355665 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:56 crc kubenswrapper[4774]: I0127 00:08:56.355770 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:56 crc kubenswrapper[4774]: E0127 00:08:56.355804 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:56 crc kubenswrapper[4774]: E0127 00:08:56.356037 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:56 crc kubenswrapper[4774]: E0127 00:08:56.356060 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:56 crc kubenswrapper[4774]: E0127 00:08:56.356107 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065030 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/1.log" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065401 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/0.log" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065438 4774 generic.go:334] "Generic (PLEG): container finished" podID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" containerID="01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972" exitCode=1 Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065475 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerDied","Data":"01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972"} Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.065528 4774 scope.go:117] "RemoveContainer" containerID="62e181e8dbb766312f0232a704ca6d8f290752fd770fac2b0722ec2beef43f1b" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.066195 4774 scope.go:117] "RemoveContainer" containerID="01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.066474 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mtz9l_openshift-multus(0abcf78e-9b05-4b89-94f3-4d3230886ce0)\"" pod="openshift-multus/multus-mtz9l" podUID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.355621 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.355779 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.356031 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.356070 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.356133 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:08:58 crc kubenswrapper[4774]: I0127 00:08:58.356071 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.356226 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:08:58 crc kubenswrapper[4774]: E0127 00:08:58.356377 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:08:59 crc kubenswrapper[4774]: I0127 00:08:59.072144 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/1.log" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.356376 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:00 crc kubenswrapper[4774]: E0127 00:09:00.357083 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.356405 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:00 crc kubenswrapper[4774]: E0127 00:09:00.357187 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.356384 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.356398 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:00 crc kubenswrapper[4774]: E0127 00:09:00.357271 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:00 crc kubenswrapper[4774]: I0127 00:09:00.357326 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:09:00 crc kubenswrapper[4774]: E0127 00:09:00.357385 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.084380 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/3.log" Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.087652 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerStarted","Data":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.088110 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.114907 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podStartSLOduration=99.114830289 podStartE2EDuration="1m39.114830289s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:01.114313455 +0000 UTC m=+119.420090369" watchObservedRunningTime="2026-01-27 00:09:01.114830289 +0000 UTC m=+119.420607203" Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.212454 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6djzf"] Jan 27 00:09:01 crc kubenswrapper[4774]: I0127 00:09:01.212631 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:01 crc kubenswrapper[4774]: E0127 00:09:01.212780 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:02 crc kubenswrapper[4774]: I0127 00:09:02.355884 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:02 crc kubenswrapper[4774]: I0127 00:09:02.355909 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:02 crc kubenswrapper[4774]: I0127 00:09:02.358408 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.358440 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.358657 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.358685 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.370607 4774 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 27 00:09:02 crc kubenswrapper[4774]: E0127 00:09:02.457320 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 00:09:03 crc kubenswrapper[4774]: I0127 00:09:03.355818 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:03 crc kubenswrapper[4774]: E0127 00:09:03.356326 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:04 crc kubenswrapper[4774]: I0127 00:09:04.356392 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:04 crc kubenswrapper[4774]: I0127 00:09:04.356475 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:04 crc kubenswrapper[4774]: I0127 00:09:04.356397 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:04 crc kubenswrapper[4774]: E0127 00:09:04.356626 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:04 crc kubenswrapper[4774]: E0127 00:09:04.356729 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:04 crc kubenswrapper[4774]: E0127 00:09:04.357033 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:05 crc kubenswrapper[4774]: I0127 00:09:05.356484 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:05 crc kubenswrapper[4774]: E0127 00:09:05.356763 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:06 crc kubenswrapper[4774]: I0127 00:09:06.356447 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:06 crc kubenswrapper[4774]: I0127 00:09:06.356579 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:06 crc kubenswrapper[4774]: I0127 00:09:06.356702 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:06 crc kubenswrapper[4774]: E0127 00:09:06.356700 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:06 crc kubenswrapper[4774]: E0127 00:09:06.356946 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:06 crc kubenswrapper[4774]: E0127 00:09:06.357070 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:07 crc kubenswrapper[4774]: I0127 00:09:07.355598 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:07 crc kubenswrapper[4774]: E0127 00:09:07.355938 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:07 crc kubenswrapper[4774]: E0127 00:09:07.458753 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 00:09:08 crc kubenswrapper[4774]: I0127 00:09:08.356122 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:08 crc kubenswrapper[4774]: I0127 00:09:08.356173 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:08 crc kubenswrapper[4774]: I0127 00:09:08.356245 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:08 crc kubenswrapper[4774]: E0127 00:09:08.356430 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:08 crc kubenswrapper[4774]: E0127 00:09:08.356614 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:08 crc kubenswrapper[4774]: E0127 00:09:08.356820 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:09 crc kubenswrapper[4774]: I0127 00:09:09.355814 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:09 crc kubenswrapper[4774]: E0127 00:09:09.356045 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:10 crc kubenswrapper[4774]: I0127 00:09:10.356292 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:10 crc kubenswrapper[4774]: I0127 00:09:10.356308 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:10 crc kubenswrapper[4774]: E0127 00:09:10.356598 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:10 crc kubenswrapper[4774]: I0127 00:09:10.356935 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:10 crc kubenswrapper[4774]: E0127 00:09:10.357037 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:10 crc kubenswrapper[4774]: E0127 00:09:10.357390 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:10 crc kubenswrapper[4774]: I0127 00:09:10.357688 4774 scope.go:117] "RemoveContainer" containerID="01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972" Jan 27 00:09:11 crc kubenswrapper[4774]: I0127 00:09:11.127811 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/1.log" Jan 27 00:09:11 crc kubenswrapper[4774]: I0127 00:09:11.128255 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c"} Jan 27 00:09:11 crc kubenswrapper[4774]: I0127 00:09:11.356126 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:11 crc kubenswrapper[4774]: E0127 00:09:11.356406 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:12 crc kubenswrapper[4774]: I0127 00:09:12.356142 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:12 crc kubenswrapper[4774]: I0127 00:09:12.356132 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:12 crc kubenswrapper[4774]: I0127 00:09:12.357611 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:12 crc kubenswrapper[4774]: E0127 00:09:12.357756 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:12 crc kubenswrapper[4774]: E0127 00:09:12.357842 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:12 crc kubenswrapper[4774]: E0127 00:09:12.357962 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:12 crc kubenswrapper[4774]: E0127 00:09:12.460217 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 00:09:13 crc kubenswrapper[4774]: I0127 00:09:13.355966 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:13 crc kubenswrapper[4774]: E0127 00:09:13.356133 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:14 crc kubenswrapper[4774]: I0127 00:09:14.356516 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:14 crc kubenswrapper[4774]: I0127 00:09:14.356594 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:14 crc kubenswrapper[4774]: I0127 00:09:14.356601 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:14 crc kubenswrapper[4774]: E0127 00:09:14.356743 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:14 crc kubenswrapper[4774]: E0127 00:09:14.356837 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:14 crc kubenswrapper[4774]: E0127 00:09:14.356894 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:15 crc kubenswrapper[4774]: I0127 00:09:15.355558 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:15 crc kubenswrapper[4774]: E0127 00:09:15.355765 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:16 crc kubenswrapper[4774]: I0127 00:09:16.356464 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:16 crc kubenswrapper[4774]: I0127 00:09:16.356551 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:16 crc kubenswrapper[4774]: I0127 00:09:16.356461 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:16 crc kubenswrapper[4774]: E0127 00:09:16.356745 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 00:09:16 crc kubenswrapper[4774]: E0127 00:09:16.356649 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 00:09:16 crc kubenswrapper[4774]: E0127 00:09:16.357012 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 00:09:17 crc kubenswrapper[4774]: I0127 00:09:17.356098 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:17 crc kubenswrapper[4774]: E0127 00:09:17.356279 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6djzf" podUID="e639e1da-0d65-4d42-b1fc-23d5db91e9e6" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.356485 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.356563 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.357507 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.360439 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.360583 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.360742 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 00:09:18 crc kubenswrapper[4774]: I0127 00:09:18.360761 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 00:09:19 crc kubenswrapper[4774]: I0127 00:09:19.357146 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:19 crc kubenswrapper[4774]: I0127 00:09:19.360281 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 00:09:19 crc kubenswrapper[4774]: I0127 00:09:19.360580 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.330297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.388819 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gstl6"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.389284 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.393572 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j4bbk"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.394085 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kv45n"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.394393 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.394739 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.395256 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.395449 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.396157 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.396487 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.399855 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.401041 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.404408 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.404932 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.405460 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.413244 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.413519 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.414005 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.414367 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.414421 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.414501 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415033 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415178 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415515 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 
00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415620 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415669 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415950 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415415 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415191 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415396 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.416722 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.416549 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.417115 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.415462 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.418594 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.419154 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.419651 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.419892 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.419927 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.420164 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.420595 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.420986 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.422187 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 
00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.422355 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.422536 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.425932 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.426760 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.428577 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.430407 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.435843 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.439038 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.444461 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.444818 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.445334 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.445604 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.445849 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.446044 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.446246 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.446500 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.446813 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.447140 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450055 4774 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450203 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450384 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450581 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450776 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.450933 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w8qm5"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.451470 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.451998 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r9m95"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.452634 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.452908 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.468355 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.469974 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.470632 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.471496 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.483134 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.491094 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.491565 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.491925 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.492088 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.492875 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29491200-4blsg"] Jan 27 00:09:25 crc 
kubenswrapper[4774]: I0127 00:09:25.493121 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.493448 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494161 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494211 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-encryption-config\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494228 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l72t8\" (UniqueName: \"kubernetes.io/projected/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-kube-api-access-l72t8\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494246 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-node-pullsecrets\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494276 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494290 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-image-import-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494310 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-policies\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494334 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nrbdc\" (UniqueName: \"kubernetes.io/projected/98696d49-1f48-4d0d-9e26-691558f704c5-kube-api-access-nrbdc\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494351 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494367 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-client\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494386 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494402 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-audit\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494419 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-serving-cert\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/1df48642-0cb9-4242-ae92-16b239115170-kube-api-access-hj8bl\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494473 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb54417-3e44-4c36-a1cf-438a705f8dcf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494495 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-config\") pod 
\"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494516 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-serving-cert\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494537 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btcqh\" (UniqueName: \"kubernetes.io/projected/7cab68e6-3113-4536-b5c3-2293265c9502-kube-api-access-btcqh\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494555 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz5lb\" (UniqueName: \"kubernetes.io/projected/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-kube-api-access-tz5lb\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494572 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956bdf0f-108e-4966-bbb6-77be23255cec-serving-cert\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494591 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1df48642-0cb9-4242-ae92-16b239115170-machine-approver-tls\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494606 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-dir\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494627 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494645 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") pod 
\"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494651 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494670 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494692 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-auth-proxy-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494710 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-images\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494728 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-client\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494747 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494769 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db8adc8a-118c-485d-b752-c3d0a2888458-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494797 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494813 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-config\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494830 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-trusted-ca\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494849 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59zpq\" (UniqueName: \"kubernetes.io/projected/7cb54417-3e44-4c36-a1cf-438a705f8dcf-kube-api-access-59zpq\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494886 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494900 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-config\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494915 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-service-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494933 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494947 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494964 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494982 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.494997 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495014 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-encryption-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495030 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cab68e6-3113-4536-b5c3-2293265c9502-metrics-tls\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495046 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-audit-dir\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495062 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495094 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-serving-cert\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495111 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495126 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksgtc\" (UniqueName: \"kubernetes.io/projected/db8adc8a-118c-485d-b752-c3d0a2888458-kube-api-access-ksgtc\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.495140 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw55r\" (UniqueName: \"kubernetes.io/projected/956bdf0f-108e-4966-bbb6-77be23255cec-kube-api-access-dw55r\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.496196 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.493438 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.496978 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.497030 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.497139 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.497220 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.497287 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.501829 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.502244 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.502435 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.502637 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.502835 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.503086 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.503195 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.503489 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.503894 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.507033 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nt6rp"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.507682 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.513735 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.516497 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.516764 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.516902 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.517027 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.517164 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.517970 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.518093 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.518192 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.518290 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.518381 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.522608 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.523149 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.528944 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-776sg"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529287 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529450 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gstl6"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529473 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529768 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529990 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.529991 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.530887 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.534916 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm9kj"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.536789 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.537184 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.537476 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.537763 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.537783 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539361 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539485 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539609 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539648 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539714 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.539780 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.540001 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.540705 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.541998 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.542628 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.543077 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.544602 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.545151 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.545316 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.545747 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.545877 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.548202 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-44dpr"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.558335 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.560451 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.577821 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.578007 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.578101 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.582576 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.584435 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.586580 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.589537 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.590138 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.591415 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.591471 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.591630 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.591632 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.595937 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kv45n"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596035 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596417 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cab68e6-3113-4536-b5c3-2293265c9502-metrics-tls\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596455 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596461 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596760 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596809 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/459b67d7-cbf3-4890-a9a2-3e143ee459aa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596844 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35026702-de7c-4f5f-8714-f1d7f89adae6-trusted-ca\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596898 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l72t8\" (UniqueName: \"kubernetes.io/projected/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-kube-api-access-l72t8\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: 
\"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596932 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e6f70216-3d55-4dd8-81f7-f2129a277407-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596991 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.596903 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z5sz"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.597291 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.597838 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.599975 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.601022 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-node-pullsecrets\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.601133 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-node-pullsecrets\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.601193 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602006 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckh7l\" (UniqueName: \"kubernetes.io/projected/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-kube-api-access-ckh7l\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 
00:09:25.602090 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrbdc\" (UniqueName: \"kubernetes.io/projected/98696d49-1f48-4d0d-9e26-691558f704c5-kube-api-access-nrbdc\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602128 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602156 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b20433-569d-4f1d-acd8-127119a934e1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602177 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftjqz\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-kube-api-access-ftjqz\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602219 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-service-ca\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602240 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-audit\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-serving-cert\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602286 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-serving-cert\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btcqh\" (UniqueName: \"kubernetes.io/projected/7cab68e6-3113-4536-b5c3-2293265c9502-kube-api-access-btcqh\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602329 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz5lb\" (UniqueName: \"kubernetes.io/projected/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-kube-api-access-tz5lb\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602350 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956bdf0f-108e-4966-bbb6-77be23255cec-serving-cert\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602372 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-metrics-certs\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602393 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602418 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5be86044-916e-44ba-9855-360b0ae2471a-proxy-tls\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602465 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lnl\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-kube-api-access-b9lnl\") pod 
\"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-client\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602502 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-config\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602520 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602561 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e83c76a-d44d-4a1b-b904-857dad56b5ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602591 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602616 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd12b52-828d-4244-99e9-2291f3a0bbb6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602640 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-auth-proxy-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602661 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-images\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602683 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602701 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602718 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-serving-cert\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602741 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602765 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-stats-auth\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602784 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602803 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-trusted-ca-bundle\") pod \"console-f9d7485db-776sg\" (UID: 
\"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602822 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db8adc8a-118c-485d-b752-c3d0a2888458-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602875 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-trusted-ca\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602896 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602914 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwq54\" (UniqueName: \"kubernetes.io/projected/193a08ea-86ea-4176-898e-6a2c476ea6e9-kube-api-access-nwq54\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602935 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602952 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-service-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-config\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 
00:09:25.602990 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603011 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b20433-569d-4f1d-acd8-127119a934e1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603030 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vrd\" (UniqueName: \"kubernetes.io/projected/607f973b-fc78-4c11-bc09-fdbbe414d8c7-kube-api-access-c9vrd\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603048 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzc9x\" (UniqueName: \"kubernetes.io/projected/5be86044-916e-44ba-9855-360b0ae2471a-kube-api-access-fzc9x\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603068 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603088 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607f973b-fc78-4c11-bc09-fdbbe414d8c7-service-ca-bundle\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603109 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603129 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79th\" (UniqueName: \"kubernetes.io/projected/ec68f29b-2b30-4dff-b875-7617466be51b-kube-api-access-q79th\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603148 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c9ce6bb-224d-498c-9299-762d84b0eaa3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603167 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-encryption-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603183 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-images\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-audit-dir\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603218 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603245 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-serving-cert\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603261 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603277 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksgtc\" (UniqueName: \"kubernetes.io/projected/db8adc8a-118c-485d-b752-c3d0a2888458-kube-api-access-ksgtc\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603295 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw55r\" (UniqueName: 
\"kubernetes.io/projected/956bdf0f-108e-4966-bbb6-77be23255cec-kube-api-access-dw55r\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603313 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e83c76a-d44d-4a1b-b904-857dad56b5ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603333 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-encryption-config\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603350 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603367 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd12b52-828d-4244-99e9-2291f3a0bbb6-config\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603397 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603417 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-image-import-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603437 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-policies\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603460 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603480 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-client\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603498 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603518 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244zr\" (UniqueName: \"kubernetes.io/projected/05b20433-569d-4f1d-acd8-127119a934e1-kube-api-access-244zr\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603539 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc8l7\" (UniqueName: \"kubernetes.io/projected/475a4aef-33e7-40ad-b7c4-20a10efa4ec3-kube-api-access-cc8l7\") pod \"downloads-7954f5f757-nt6rp\" (UID: \"475a4aef-33e7-40ad-b7c4-20a10efa4ec3\") " pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603559 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/1df48642-0cb9-4242-ae92-16b239115170-kube-api-access-hj8bl\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603575 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb54417-3e44-4c36-a1cf-438a705f8dcf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603610 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-config\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603628 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/459b67d7-cbf3-4890-a9a2-3e143ee459aa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1df48642-0cb9-4242-ae92-16b239115170-machine-approver-tls\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603667 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-dir\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603683 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-oauth-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603701 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec68f29b-2b30-4dff-b875-7617466be51b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-oauth-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603761 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-default-certificate\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603782 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603800 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhl8d\" (UniqueName: \"kubernetes.io/projected/7e83c76a-d44d-4a1b-b904-857dad56b5ba-kube-api-access-bhl8d\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603820 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.603839 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c9ce6bb-224d-498c-9299-762d84b0eaa3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.604791 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-serving-cert\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.605422 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-audit\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.602375 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.606915 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.607262 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.607440 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.607982 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-auth-proxy-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.608133 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-serving-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.608819 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-image-import-ca\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.608992 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.609100 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-images\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.609394 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-policies\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.610101 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.611092 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-encryption-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.611140 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98696d49-1f48-4d0d-9e26-691558f704c5-audit-dir\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.611371 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-serving-cert\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.611793 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.612657 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.613215 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-client\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.614284 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.615131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-config\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.615208 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-audit-dir\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.615606 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1df48642-0cb9-4242-ae92-16b239115170-config\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.615830 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-config\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616114 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-etcd-client\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616168 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616202 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459b67d7-cbf3-4890-a9a2-3e143ee459aa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616226 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616236 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616257 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4648\" (UniqueName: \"kubernetes.io/projected/e6f70216-3d55-4dd8-81f7-f2129a277407-kube-api-access-q4648\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616281 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616362 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-config\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616385 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616411 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616434 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59zpq\" (UniqueName: \"kubernetes.io/projected/7cb54417-3e44-4c36-a1cf-438a705f8dcf-kube-api-access-59zpq\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616455 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-config\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616477 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f70216-3d55-4dd8-81f7-f2129a277407-serving-cert\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616497 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd12b52-828d-4244-99e9-2291f3a0bbb6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616518 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616553 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616579 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.616606 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-service-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.617294 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-config\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.617485 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618042 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb54417-3e44-4c36-a1cf-438a705f8dcf-config\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618083 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-serving-cert\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618108 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: 
\"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618137 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35026702-de7c-4f5f-8714-f1d7f89adae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618161 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.618182 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.619446 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.619656 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cab68e6-3113-4536-b5c3-2293265c9502-metrics-tls\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.620911 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98696d49-1f48-4d0d-9e26-691558f704c5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.621667 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.621675 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-service-ca-bundle\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.622173 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.622260 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.622384 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.622568 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.623291 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/956bdf0f-108e-4966-bbb6-77be23255cec-trusted-ca\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.623315 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.623414 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/956bdf0f-108e-4966-bbb6-77be23255cec-serving-cert\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.623876 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7cb54417-3e44-4c36-a1cf-438a705f8dcf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.624925 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.625338 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.625723 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.625745 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1df48642-0cb9-4242-ae92-16b239115170-machine-approver-tls\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.625936 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.626055 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-encryption-config\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.626082 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.626842 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.627233 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98696d49-1f48-4d0d-9e26-691558f704c5-etcd-client\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.627470 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.628225 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.630227 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.630438 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/db8adc8a-118c-485d-b752-c3d0a2888458-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.630713 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.631743 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8kqjp"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.632652 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.633562 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.643584 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.649251 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.651948 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j4bbk"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.652065 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.653069 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nt6rp"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.654352 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r9m95"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.655583 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29491200-4blsg"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.658088 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w8qm5"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.659190 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ttgpc"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.659920 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.660255 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.661361 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.662426 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm9kj"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.663635 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.664669 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.665925 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.666974 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.668159 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.669151 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.672529 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.674410 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.676240 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.679977 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.682015 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-776sg"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.683693 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.684822 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.685896 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.686888 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.687928 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.689195 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z5sz"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.690255 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.691234 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8kqjp"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.692262 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.693622 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m59xl"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.694984 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-th968"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.695147 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.696394 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.696441 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.696581 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-th968" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.697966 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.698530 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.699554 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ttgpc"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.700551 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.701747 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.702952 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m59xl"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.703949 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-th968"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.705081 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pjtrg"] Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.705841 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.711609 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719386 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckh7l\" (UniqueName: \"kubernetes.io/projected/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-kube-api-access-ckh7l\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719425 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719467 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719492 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftjqz\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-kube-api-access-ftjqz\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719514 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719534 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b20433-569d-4f1d-acd8-127119a934e1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719556 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-service-ca\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719594 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-metrics-certs\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719638 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5be86044-916e-44ba-9855-360b0ae2471a-proxy-tls\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719659 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lnl\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-kube-api-access-b9lnl\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719679 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-client\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719699 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-config\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719740 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719775 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e83c76a-d44d-4a1b-b904-857dad56b5ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719799 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd12b52-828d-4244-99e9-2291f3a0bbb6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719818 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719836 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719888 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-serving-cert\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719910 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-stats-auth\") pod 
\"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719933 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719955 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-trusted-ca-bundle\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719978 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.719998 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwq54\" (UniqueName: \"kubernetes.io/projected/193a08ea-86ea-4176-898e-6a2c476ea6e9-kube-api-access-nwq54\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720021 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-config\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720038 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720058 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b20433-569d-4f1d-acd8-127119a934e1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720077 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vrd\" (UniqueName: \"kubernetes.io/projected/607f973b-fc78-4c11-bc09-fdbbe414d8c7-kube-api-access-c9vrd\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720098 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzc9x\" (UniqueName: \"kubernetes.io/projected/5be86044-916e-44ba-9855-360b0ae2471a-kube-api-access-fzc9x\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607f973b-fc78-4c11-bc09-fdbbe414d8c7-service-ca-bundle\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720141 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c9ce6bb-224d-498c-9299-762d84b0eaa3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720163 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q79th\" (UniqueName: \"kubernetes.io/projected/ec68f29b-2b30-4dff-b875-7617466be51b-kube-api-access-q79th\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720182 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-images\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720226 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e83c76a-d44d-4a1b-b904-857dad56b5ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720250 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720301 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd12b52-828d-4244-99e9-2291f3a0bbb6-config\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720332 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-244zr\" (UniqueName: \"kubernetes.io/projected/05b20433-569d-4f1d-acd8-127119a934e1-kube-api-access-244zr\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720353 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc8l7\" (UniqueName: \"kubernetes.io/projected/475a4aef-33e7-40ad-b7c4-20a10efa4ec3-kube-api-access-cc8l7\") pod \"downloads-7954f5f757-nt6rp\" (UID: \"475a4aef-33e7-40ad-b7c4-20a10efa4ec3\") " pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720385 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/459b67d7-cbf3-4890-a9a2-3e143ee459aa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-oauth-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720428 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-oauth-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720457 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec68f29b-2b30-4dff-b875-7617466be51b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720490 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-default-certificate\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720520 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhl8d\" (UniqueName: \"kubernetes.io/projected/7e83c76a-d44d-4a1b-b904-857dad56b5ba-kube-api-access-bhl8d\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720563 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c9ce6bb-224d-498c-9299-762d84b0eaa3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720591 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720596 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b20433-569d-4f1d-acd8-127119a934e1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720613 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459b67d7-cbf3-4890-a9a2-3e143ee459aa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720637 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720660 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4648\" (UniqueName: \"kubernetes.io/projected/e6f70216-3d55-4dd8-81f7-f2129a277407-kube-api-access-q4648\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720701 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720713 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720757 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720755 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-service-ca\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720787 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f70216-3d55-4dd8-81f7-f2129a277407-serving-cert\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720810 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd12b52-828d-4244-99e9-2291f3a0bbb6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720830 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720852 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720888 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: 
\"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720906 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-service-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720942 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35026702-de7c-4f5f-8714-f1d7f89adae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720965 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.720986 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.721017 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/459b67d7-cbf3-4890-a9a2-3e143ee459aa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.721040 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35026702-de7c-4f5f-8714-f1d7f89adae6-trusted-ca\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.721060 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e6f70216-3d55-4dd8-81f7-f2129a277407-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.721081 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc 
kubenswrapper[4774]: I0127 00:09:25.721793 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722100 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722120 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-oauth-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722178 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e83c76a-d44d-4a1b-b904-857dad56b5ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722438 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.722574 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c9ce6bb-224d-498c-9299-762d84b0eaa3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.724891 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.725194 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.725351 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35026702-de7c-4f5f-8714-f1d7f89adae6-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.725847 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e6f70216-3d55-4dd8-81f7-f2129a277407-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.725915 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.727142 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-trusted-ca-bundle\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.728546 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.730380 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.730440 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.731193 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.731621 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.735166 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.735967 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.736173 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e83c76a-d44d-4a1b-b904-857dad56b5ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.736318 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c9ce6bb-224d-498c-9299-762d84b0eaa3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738053 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738168 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f70216-3d55-4dd8-81f7-f2129a277407-serving-cert\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738184 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b20433-569d-4f1d-acd8-127119a934e1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738291 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-client\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738509 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-serving-cert\") pod \"console-f9d7485db-776sg\" (UID: 
\"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.738803 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-oauth-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.739495 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.752874 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.753894 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.772761 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.791312 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.811941 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.815629 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-etcd-service-ca\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.831206 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.841171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/35026702-de7c-4f5f-8714-f1d7f89adae6-metrics-tls\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.851972 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.862745 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-serving-cert\") pod \"etcd-operator-b45778765-hm9kj\" (UID: 
\"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.873335 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.892518 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.897549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-config\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.911734 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.932354 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.952260 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 00:09:25 crc kubenswrapper[4774]: I0127 00:09:25.972353 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.006075 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.013281 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.031850 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.033436 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/193a08ea-86ea-4176-898e-6a2c476ea6e9-console-config\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.053006 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.057769 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.071438 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.081724 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-config\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.112201 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.113132 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd12b52-828d-4244-99e9-2291f3a0bbb6-config\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.131840 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.152758 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.165965 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd12b52-828d-4244-99e9-2291f3a0bbb6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.171846 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.193392 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.212356 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.221229 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-stats-auth\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.233167 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.245382 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-metrics-certs\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.252590 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.272012 
4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.277459 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/607f973b-fc78-4c11-bc09-fdbbe414d8c7-default-certificate\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.292594 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.298449 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/607f973b-fc78-4c11-bc09-fdbbe414d8c7-service-ca-bundle\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.312915 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.333909 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.352779 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.372529 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.379742 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/459b67d7-cbf3-4890-a9a2-3e143ee459aa-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.393337 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.396951 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/459b67d7-cbf3-4890-a9a2-3e143ee459aa-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.412931 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.422478 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5be86044-916e-44ba-9855-360b0ae2471a-images\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.432901 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.447037 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec68f29b-2b30-4dff-b875-7617466be51b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.452833 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.472272 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.476694 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5be86044-916e-44ba-9855-360b0ae2471a-proxy-tls\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.491709 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.532037 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.552020 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.572777 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.592801 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.609969 4774 request.go:700] Waited for 1.010469645s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0 Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.612192 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.632485 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.673115 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l72t8\" (UniqueName: 
\"kubernetes.io/projected/86bc63c8-36a5-4c7a-bf5d-c8f93744a251-kube-api-access-l72t8\") pod \"apiserver-7bbb656c7d-9r6fm\" (UID: \"86bc63c8-36a5-4c7a-bf5d-c8f93744a251\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.690356 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrbdc\" (UniqueName: \"kubernetes.io/projected/98696d49-1f48-4d0d-9e26-691558f704c5-kube-api-access-nrbdc\") pod \"apiserver-76f77b778f-j4bbk\" (UID: \"98696d49-1f48-4d0d-9e26-691558f704c5\") " pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.691831 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.712552 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.715105 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz5lb\" (UniqueName: \"kubernetes.io/projected/c64d4fcd-470f-43e8-bab0-09c0b59eaf6d-kube-api-access-tz5lb\") pod \"authentication-operator-69f744f599-gstl6\" (UID: \"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.732722 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.759485 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.770817 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") pod \"controller-manager-879f6c89f-5b8h8\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.792569 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.793676 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btcqh\" (UniqueName: \"kubernetes.io/projected/7cab68e6-3113-4536-b5c3-2293265c9502-kube-api-access-btcqh\") pod \"dns-operator-744455d44c-w8qm5\" (UID: \"7cab68e6-3113-4536-b5c3-2293265c9502\") " pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.834762 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") pod \"route-controller-manager-6576b87f9c-gthj4\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.843386 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59zpq\" (UniqueName: \"kubernetes.io/projected/7cb54417-3e44-4c36-a1cf-438a705f8dcf-kube-api-access-59zpq\") pod \"machine-api-operator-5694c8668f-kv45n\" (UID: \"7cb54417-3e44-4c36-a1cf-438a705f8dcf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.855438 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksgtc\" (UniqueName: \"kubernetes.io/projected/db8adc8a-118c-485d-b752-c3d0a2888458-kube-api-access-ksgtc\") pod \"cluster-samples-operator-665b6dd947-5rsb9\" (UID: \"db8adc8a-118c-485d-b752-c3d0a2888458\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.867458 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.870381 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw55r\" (UniqueName: \"kubernetes.io/projected/956bdf0f-108e-4966-bbb6-77be23255cec-kube-api-access-dw55r\") pod \"console-operator-58897d9998-r9m95\" (UID: \"956bdf0f-108e-4966-bbb6-77be23255cec\") " pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.891754 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.894727 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/1df48642-0cb9-4242-ae92-16b239115170-kube-api-access-hj8bl\") pod \"machine-approver-56656f9798-p6qnr\" (UID: \"1df48642-0cb9-4242-ae92-16b239115170\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.917249 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.932542 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.950125 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.952136 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.983825 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.992910 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 00:09:26 crc kubenswrapper[4774]: I0127 00:09:26.994166 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-j4bbk"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.001027 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" Jan 27 00:09:27 crc kubenswrapper[4774]: W0127 00:09:27.003055 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98696d49_1f48_4d0d_9e26_691558f704c5.slice/crio-b92d7c910113ca7731f1341fe96c66c9b8adf4f2552c8ba080026856f64ef501 WatchSource:0}: Error finding container b92d7c910113ca7731f1341fe96c66c9b8adf4f2552c8ba080026856f64ef501: Status 404 returned error can't find the container with id b92d7c910113ca7731f1341fe96c66c9b8adf4f2552c8ba080026856f64ef501 Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.012596 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.018093 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.026534 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.032379 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: W0127 00:09:27.038654 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86bc63c8_36a5_4c7a_bf5d_c8f93744a251.slice/crio-e0ac04bc30e5a5895615f837f5c07fb55b29ff08ef34335c00fb2b41eff339ca WatchSource:0}: Error finding container e0ac04bc30e5a5895615f837f5c07fb55b29ff08ef34335c00fb2b41eff339ca: Status 404 returned error can't find the container with id e0ac04bc30e5a5895615f837f5c07fb55b29ff08ef34335c00fb2b41eff339ca Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.039055 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.053716 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.054621 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.071379 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.089157 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w8qm5"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.091840 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.106808 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.113022 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.132279 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.153494 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.158718 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.165590 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gstl6"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.173077 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: W0127 00:09:27.193802 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc64d4fcd_470f_43e8_bab0_09c0b59eaf6d.slice/crio-f8f2d0f50c62531a6b84888ac44898fa7725445d28531a88079b90066b46c1bd WatchSource:0}: Error finding container f8f2d0f50c62531a6b84888ac44898fa7725445d28531a88079b90066b46c1bd: Status 404 returned error can't find the container with id f8f2d0f50c62531a6b84888ac44898fa7725445d28531a88079b90066b46c1bd Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.194065 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.207449 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" event={"ID":"7cab68e6-3113-4536-b5c3-2293265c9502","Type":"ContainerStarted","Data":"8dafee3215369519aaec1b82214b50f407e2311b31a51bec8c51f4d7d870a53c"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.209197 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" event={"ID":"1df48642-0cb9-4242-ae92-16b239115170","Type":"ContainerStarted","Data":"0d88e49e1a3e17cc6e532723981b1d1e24f28507b4012595a22573fdde0c7df3"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.210602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" event={"ID":"9166b710-4bb0-4fc0-8e54-45907543c22f","Type":"ContainerStarted","Data":"20e7f45e0bc628b7de64cc93ac19fdef43b151e10efd40aab961cc3102eb973d"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.212649 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.218152 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" 
event={"ID":"86bc63c8-36a5-4c7a-bf5d-c8f93744a251","Type":"ContainerStarted","Data":"e0ac04bc30e5a5895615f837f5c07fb55b29ff08ef34335c00fb2b41eff339ca"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.232125 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.243602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" event={"ID":"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d","Type":"ContainerStarted","Data":"f8f2d0f50c62531a6b84888ac44898fa7725445d28531a88079b90066b46c1bd"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.249487 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" event={"ID":"98696d49-1f48-4d0d-9e26-691558f704c5","Type":"ContainerStarted","Data":"b92d7c910113ca7731f1341fe96c66c9b8adf4f2552c8ba080026856f64ef501"} Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.252940 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.260389 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kv45n"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.272544 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.292290 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.306696 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.312536 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.332765 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.353206 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 00:09:27 crc kubenswrapper[4774]: W0127 00:09:27.368430 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6a02f8_e5e6_49b7_99cb_29d40ba9fdff.slice/crio-63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863 WatchSource:0}: Error finding container 63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863: Status 404 returned error can't find the container with id 63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863 Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.372852 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.392662 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 
00:09:27.415148 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.434449 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.436694 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.454067 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.472145 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.472999 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-r9m95"] Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.492286 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.512125 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.533700 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.551617 4774 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.572145 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.592290 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.610537 4774 request.go:700] Waited for 1.913700516s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.612513 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.633330 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.652632 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.673040 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.694218 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.712881 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.760430 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckh7l\" (UniqueName: \"kubernetes.io/projected/ffb07ef8-54f3-448a-90ac-3da3e0d686ae-kube-api-access-ckh7l\") pod \"etcd-operator-b45778765-hm9kj\" (UID: \"ffb07ef8-54f3-448a-90ac-3da3e0d686ae\") " pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.775358 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") pod \"image-pruner-29491200-4blsg\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.785761 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.805796 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftjqz\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-kube-api-access-ftjqz\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.808101 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lnl\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-kube-api-access-b9lnl\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.829707 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79th\" (UniqueName: \"kubernetes.io/projected/ec68f29b-2b30-4dff-b875-7617466be51b-kube-api-access-q79th\") pod \"control-plane-machine-set-operator-78cbb6b69f-tjg8b\" (UID: \"ec68f29b-2b30-4dff-b875-7617466be51b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.848537 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc8l7\" (UniqueName: \"kubernetes.io/projected/475a4aef-33e7-40ad-b7c4-20a10efa4ec3-kube-api-access-cc8l7\") pod \"downloads-7954f5f757-nt6rp\" (UID: \"475a4aef-33e7-40ad-b7c4-20a10efa4ec3\") " pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.866370 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/459b67d7-cbf3-4890-a9a2-3e143ee459aa-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jc5dv\" (UID: \"459b67d7-cbf3-4890-a9a2-3e143ee459aa\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.885611 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.900283 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c9ce6bb-224d-498c-9299-762d84b0eaa3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8dbr9\" (UID: \"8c9ce6bb-224d-498c-9299-762d84b0eaa3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.917671 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244zr\" (UniqueName: \"kubernetes.io/projected/05b20433-569d-4f1d-acd8-127119a934e1-kube-api-access-244zr\") pod \"openshift-controller-manager-operator-756b6f6bc6-h7h7w\" (UID: \"05b20433-569d-4f1d-acd8-127119a934e1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.926810 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.932094 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhl8d\" (UniqueName: \"kubernetes.io/projected/7e83c76a-d44d-4a1b-b904-857dad56b5ba-kube-api-access-bhl8d\") pod \"openshift-apiserver-operator-796bbdcf4f-ddxk7\" (UID: \"7e83c76a-d44d-4a1b-b904-857dad56b5ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.941636 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.951974 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcd12b52-828d-4244-99e9-2291f3a0bbb6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-772k8\" (UID: \"dcd12b52-828d-4244-99e9-2291f3a0bbb6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.973292 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") pod \"oauth-openshift-558db77b4-brc4j\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:27 crc kubenswrapper[4774]: I0127 00:09:27.989325 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ea5ac78-6111-4f12-a2db-3a6f4e400d39-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rnh4z\" (UID: \"9ea5ac78-6111-4f12-a2db-3a6f4e400d39\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.017398 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4648\" (UniqueName: \"kubernetes.io/projected/e6f70216-3d55-4dd8-81f7-f2129a277407-kube-api-access-q4648\") pod \"openshift-config-operator-7777fb866f-vl2l2\" (UID: \"e6f70216-3d55-4dd8-81f7-f2129a277407\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.032697 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35026702-de7c-4f5f-8714-f1d7f89adae6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r8zvt\" (UID: \"35026702-de7c-4f5f-8714-f1d7f89adae6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.050387 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29491200-4blsg"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.068240 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwq54\" (UniqueName: \"kubernetes.io/projected/193a08ea-86ea-4176-898e-6a2c476ea6e9-kube-api-access-nwq54\") pod \"console-f9d7485db-776sg\" (UID: \"193a08ea-86ea-4176-898e-6a2c476ea6e9\") " pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.068604 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzc9x\" (UniqueName: \"kubernetes.io/projected/5be86044-916e-44ba-9855-360b0ae2471a-kube-api-access-fzc9x\") pod \"machine-config-operator-74547568cd-gf7zx\" (UID: \"5be86044-916e-44ba-9855-360b0ae2471a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.078086 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.096527 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vrd\" (UniqueName: \"kubernetes.io/projected/607f973b-fc78-4c11-bc09-fdbbe414d8c7-kube-api-access-c9vrd\") pod \"router-default-5444994796-44dpr\" (UID: \"607f973b-fc78-4c11-bc09-fdbbe414d8c7\") " pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.099616 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.109284 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.119283 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.150684 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-hm9kj"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.158290 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.164507 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167197 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167255 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167301 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167353 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 
00:09:28.167383 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167410 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167436 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.167494 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.168257 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.668189366 +0000 UTC m=+146.973966240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.170246 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.178567 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.201773 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.209470 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.209908 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.217458 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.234297 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.269985 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.270124 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.77009517 +0000 UTC m=+147.075872054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270249 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/697daa16-5938-4757-a941-83fa9dbb019b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270333 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-mountpoint-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270360 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270428 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270476 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-profile-collector-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270504 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270575 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270637 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c89zl\" (UniqueName: \"kubernetes.io/projected/995e8bee-a3aa-466d-8007-4d12eab0d045-kube-api-access-c89zl\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270679 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bkh6\" (UniqueName: \"kubernetes.io/projected/31004ea4-e1fa-489e-a44e-701f370c9899-kube-api-access-4bkh6\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270700 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khrgg\" (UniqueName: 
\"kubernetes.io/projected/9e947476-a4bc-441e-97ab-2caba294339b-kube-api-access-khrgg\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270728 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270771 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wp9h\" (UniqueName: \"kubernetes.io/projected/697daa16-5938-4757-a941-83fa9dbb019b-kube-api-access-6wp9h\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270795 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270815 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-apiservice-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270884 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjzkw\" (UniqueName: \"kubernetes.io/projected/16b7780a-d64f-4a63-b462-c924d7c44aac-kube-api-access-tjzkw\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270939 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-socket-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.270959 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg6q\" (UniqueName: 
\"kubernetes.io/projected/337a9389-b527-41a3-aba6-fada15fb648c-kube-api-access-fgg6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.271006 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.271057 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e947476-a4bc-441e-97ab-2caba294339b-serving-cert\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.271156 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337a9389-b527-41a3-aba6-fada15fb648c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.272493 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.272641 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.772631568 +0000 UTC m=+147.078408452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.273714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.274180 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-tmpfs\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.274545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-srv-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.274802 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-cabundle\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276377 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-node-bootstrap-token\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276440 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276458 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-certs\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276487 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gxxff\" (UniqueName: \"kubernetes.io/projected/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-kube-api-access-gxxff\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-registration-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276565 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e947476-a4bc-441e-97ab-2caba294339b-config\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276646 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2fq5\" (UniqueName: \"kubernetes.io/projected/39fc27d8-358f-40a9-8128-35b2e8f96f5c-kube-api-access-t2fq5\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276707 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276731 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276755 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtcp4\" (UniqueName: \"kubernetes.io/projected/f6821801-4d19-46b9-8a38-e55810cb2dbf-kube-api-access-dtcp4\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276771 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-plugins-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276808 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-key\") pod 
\"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276869 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276887 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47e6217b-24f0-495d-8c8e-cb684083e8dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276934 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-srv-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276952 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnnw2\" (UniqueName: \"kubernetes.io/projected/4718399b-22db-4443-bc52-22e461891f11-kube-api-access-mnnw2\") pod \"migrator-59844c95c7-97wdt\" (UID: \"4718399b-22db-4443-bc52-22e461891f11\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.276987 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vcnl\" (UniqueName: \"kubernetes.io/projected/75b25a91-05cf-44c9-b656-c8a07515d84f-kube-api-access-5vcnl\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277021 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995e8bee-a3aa-466d-8007-4d12eab0d045-metrics-tls\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277049 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfhft\" (UniqueName: \"kubernetes.io/projected/30b3f92f-3bf5-448c-9542-6217fb51f239-kube-api-access-sfhft\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psmsg\" (UniqueName: \"kubernetes.io/projected/7bfd245d-eb59-40d5-b1cf-517beaa46f32-kube-api-access-psmsg\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277083 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995e8bee-a3aa-466d-8007-4d12eab0d045-config-volume\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277109 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-webhook-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277170 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337a9389-b527-41a3-aba6-fada15fb648c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9pnb\" (UniqueName: \"kubernetes.io/projected/47e6217b-24f0-495d-8c8e-cb684083e8dd-kube-api-access-p9pnb\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277673 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-csi-data-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277746 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39fc27d8-358f-40a9-8128-35b2e8f96f5c-cert\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277797 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31004ea4-e1fa-489e-a44e-701f370c9899-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.277841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: 
I0127 00:09:28.277874 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31004ea4-e1fa-489e-a44e-701f370c9899-proxy-tls\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.284016 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.286690 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.288411 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.291537 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.311474 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.327416 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.337379 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" event={"ID":"7cb54417-3e44-4c36-a1cf-438a705f8dcf","Type":"ContainerStarted","Data":"9aec1831ac31422db43bdee5e6e742d2fdd80c086acbd5a9cc29da470da1d7eb"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.337432 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" event={"ID":"7cb54417-3e44-4c36-a1cf-438a705f8dcf","Type":"ContainerStarted","Data":"3c0392836f2dc327d5d8301bf0194f0bc3a8672e190262cb3c51dd36dcfb7803"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.337445 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" 
event={"ID":"7cb54417-3e44-4c36-a1cf-438a705f8dcf","Type":"ContainerStarted","Data":"f168fbc6b9f63899e12c4476d37e1c492cd33317d74be0e489ae286808a4ef57"} Jan 27 00:09:28 crc kubenswrapper[4774]: W0127 00:09:28.346775 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec68f29b_2b30_4dff_b875_7617466be51b.slice/crio-094c34af32cd5eaea1f5457aa5e6dd3fb684dafbcd4d452b53af81c3814744b1 WatchSource:0}: Error finding container 094c34af32cd5eaea1f5457aa5e6dd3fb684dafbcd4d452b53af81c3814744b1: Status 404 returned error can't find the container with id 094c34af32cd5eaea1f5457aa5e6dd3fb684dafbcd4d452b53af81c3814744b1 Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.378739 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.379577 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.879530804 +0000 UTC m=+147.185307688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.381889 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-socket-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.381942 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg6q\" (UniqueName: \"kubernetes.io/projected/337a9389-b527-41a3-aba6-fada15fb648c-kube-api-access-fgg6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.381976 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382015 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-tmpfs\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: 
\"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382039 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e947476-a4bc-441e-97ab-2caba294339b-serving-cert\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382062 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/337a9389-b527-41a3-aba6-fada15fb648c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382093 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-srv-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382129 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-cabundle\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382157 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-node-bootstrap-token\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382180 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-certs\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382199 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxff\" (UniqueName: \"kubernetes.io/projected/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-kube-api-access-gxxff\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382222 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-registration-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382254 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e947476-a4bc-441e-97ab-2caba294339b-config\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382283 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2fq5\" (UniqueName: \"kubernetes.io/projected/39fc27d8-358f-40a9-8128-35b2e8f96f5c-kube-api-access-t2fq5\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382311 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382334 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtcp4\" (UniqueName: \"kubernetes.io/projected/f6821801-4d19-46b9-8a38-e55810cb2dbf-kube-api-access-dtcp4\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382355 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-plugins-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382376 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-key\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382398 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47e6217b-24f0-495d-8c8e-cb684083e8dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382423 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382445 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-srv-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382468 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnnw2\" (UniqueName: \"kubernetes.io/projected/4718399b-22db-4443-bc52-22e461891f11-kube-api-access-mnnw2\") pod \"migrator-59844c95c7-97wdt\" (UID: \"4718399b-22db-4443-bc52-22e461891f11\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382513 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vcnl\" (UniqueName: \"kubernetes.io/projected/75b25a91-05cf-44c9-b656-c8a07515d84f-kube-api-access-5vcnl\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382550 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995e8bee-a3aa-466d-8007-4d12eab0d045-metrics-tls\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382582 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfhft\" (UniqueName: \"kubernetes.io/projected/30b3f92f-3bf5-448c-9542-6217fb51f239-kube-api-access-sfhft\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382610 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psmsg\" (UniqueName: \"kubernetes.io/projected/7bfd245d-eb59-40d5-b1cf-517beaa46f32-kube-api-access-psmsg\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382635 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995e8bee-a3aa-466d-8007-4d12eab0d045-config-volume\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382646 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-tmpfs\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382677 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-webhook-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382756 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/337a9389-b527-41a3-aba6-fada15fb648c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382792 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9pnb\" (UniqueName: \"kubernetes.io/projected/47e6217b-24f0-495d-8c8e-cb684083e8dd-kube-api-access-p9pnb\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382844 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39fc27d8-358f-40a9-8128-35b2e8f96f5c-cert\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382913 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-csi-data-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382947 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31004ea4-e1fa-489e-a44e-701f370c9899-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.382991 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31004ea4-e1fa-489e-a44e-701f370c9899-proxy-tls\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383024 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/697daa16-5938-4757-a941-83fa9dbb019b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383088 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-mountpoint-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383117 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383158 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-profile-collector-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383188 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383291 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383323 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c89zl\" (UniqueName: \"kubernetes.io/projected/995e8bee-a3aa-466d-8007-4d12eab0d045-kube-api-access-c89zl\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383356 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bkh6\" (UniqueName: \"kubernetes.io/projected/31004ea4-e1fa-489e-a44e-701f370c9899-kube-api-access-4bkh6\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383378 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khrgg\" (UniqueName: \"kubernetes.io/projected/9e947476-a4bc-441e-97ab-2caba294339b-kube-api-access-khrgg\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383403 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383424 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wp9h\" (UniqueName: \"kubernetes.io/projected/697daa16-5938-4757-a941-83fa9dbb019b-kube-api-access-6wp9h\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383451 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-apiservice-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383483 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjzkw\" (UniqueName: \"kubernetes.io/projected/16b7780a-d64f-4a63-b462-c924d7c44aac-kube-api-access-tjzkw\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.383983 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-socket-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.384549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e947476-a4bc-441e-97ab-2caba294339b-config\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.384697 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-registration-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.384797 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/337a9389-b527-41a3-aba6-fada15fb648c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.385324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-cabundle\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.386825 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-plugins-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.387265 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.387638 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.887620551 +0000 UTC m=+147.193397435 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.388233 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.389178 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-csi-data-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.389504 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/16b7780a-d64f-4a63-b462-c924d7c44aac-mountpoint-dir\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.390102 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31004ea4-e1fa-489e-a44e-701f370c9899-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.390604 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995e8bee-a3aa-466d-8007-4d12eab0d045-config-volume\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.395187 4774 generic.go:334] "Generic (PLEG): container 
finished" podID="86bc63c8-36a5-4c7a-bf5d-c8f93744a251" containerID="2769c007da48b10c76f3468831c07b445783b7340d1320618fcd224dd716268a" exitCode=0 Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.403253 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-srv-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.405775 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39fc27d8-358f-40a9-8128-35b2e8f96f5c-cert\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.424357 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e947476-a4bc-441e-97ab-2caba294339b-serving-cert\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.424462 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425019 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-srv-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.424926 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/47e6217b-24f0-495d-8c8e-cb684083e8dd-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425440 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f6821801-4d19-46b9-8a38-e55810cb2dbf-signing-key\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425489 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31004ea4-e1fa-489e-a44e-701f370c9899-proxy-tls\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425500 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/337a9389-b527-41a3-aba6-fada15fb648c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425659 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/995e8bee-a3aa-466d-8007-4d12eab0d045-metrics-tls\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.425723 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-certs\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.433057 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30b3f92f-3bf5-448c-9542-6217fb51f239-node-bootstrap-token\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.434852 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/697daa16-5938-4757-a941-83fa9dbb019b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.436311 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-apiservice-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.437161 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.439999 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-webhook-cert\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.448333 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/75b25a91-05cf-44c9-b656-c8a07515d84f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.448904 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bfd245d-eb59-40d5-b1cf-517beaa46f32-profile-collector-cert\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.467407 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjzkw\" (UniqueName: \"kubernetes.io/projected/16b7780a-d64f-4a63-b462-c924d7c44aac-kube-api-access-tjzkw\") pod \"csi-hostpathplugin-m59xl\" (UID: \"16b7780a-d64f-4a63-b462-c924d7c44aac\") " pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.470778 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.476872 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg6q\" (UniqueName: \"kubernetes.io/projected/337a9389-b527-41a3-aba6-fada15fb648c-kube-api-access-fgg6q\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7qfw\" (UID: \"337a9389-b527-41a3-aba6-fada15fb648c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.482163 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") pod \"collect-profiles-29491200-fxl4m\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.484558 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.485616 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:28.985597105 +0000 UTC m=+147.291373989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.494664 4774 generic.go:334] "Generic (PLEG): container finished" podID="98696d49-1f48-4d0d-9e26-691558f704c5" containerID="461f3c798ef2ec6334ea60d55994e80867f8d705a04bf8eecd05f5e88be89c4e" exitCode=0 Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.502012 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxff\" (UniqueName: \"kubernetes.io/projected/959cd32e-9b14-4df3-aadf-d51b5a5a3c14-kube-api-access-gxxff\") pod \"packageserver-d55dfcdfc-4btn7\" (UID: \"959cd32e-9b14-4df3-aadf-d51b5a5a3c14\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.507226 4774 patch_prober.go:28] interesting pod/console-operator-58897d9998-r9m95 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.507318 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-r9m95" podUID="956bdf0f-108e-4966-bbb6-77be23255cec" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" event={"ID":"9166b710-4bb0-4fc0-8e54-45907543c22f","Type":"ContainerStarted","Data":"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517315 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" event={"ID":"ffb07ef8-54f3-448a-90ac-3da3e0d686ae","Type":"ContainerStarted","Data":"a28eab8188e8572bd395124f34aefe010e6daef7c7d23fae32440486658bfe60"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517327 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" event={"ID":"86bc63c8-36a5-4c7a-bf5d-c8f93744a251","Type":"ContainerDied","Data":"2769c007da48b10c76f3468831c07b445783b7340d1320618fcd224dd716268a"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517346 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517379 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517391 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 
00:09:28.517401 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" event={"ID":"1df48642-0cb9-4242-ae92-16b239115170","Type":"ContainerStarted","Data":"1de92e5e9565cd6e3aa14a05222aa73a7ad801168301409f5941e309948a72d5"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517416 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" event={"ID":"1df48642-0cb9-4242-ae92-16b239115170","Type":"ContainerStarted","Data":"96683b71050d117ee8ad13ce0ae940496549a206d325d504ab0af76a45f881f7"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517429 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517439 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29491200-4blsg" event={"ID":"518a161e-aeab-4ad6-a2c0-dee7ec963958","Type":"ContainerStarted","Data":"b36612e805e28ae8902021d1d0300be7876973ace2dc30015657a43e1a1a97ef"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517450 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" event={"ID":"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff","Type":"ContainerStarted","Data":"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517496 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517510 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" event={"ID":"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff","Type":"ContainerStarted","Data":"63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517525 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r9m95" event={"ID":"956bdf0f-108e-4966-bbb6-77be23255cec","Type":"ContainerStarted","Data":"756673614d2902e6a381b31700f9a1ac56ab898d15ecff5022830f75378c833b"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517536 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-r9m95" event={"ID":"956bdf0f-108e-4966-bbb6-77be23255cec","Type":"ContainerStarted","Data":"fb91cd0fb7c4d8b171636ad72da91302e616bee195a11f3c8f1a5c8c1902137e"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517547 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" event={"ID":"c64d4fcd-470f-43e8-bab0-09c0b59eaf6d","Type":"ContainerStarted","Data":"6650f4000967cdc6ada4b19f88ec6b9715afcbdced999fcdd492d510fe5cf68b"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" event={"ID":"98696d49-1f48-4d0d-9e26-691558f704c5","Type":"ContainerDied","Data":"461f3c798ef2ec6334ea60d55994e80867f8d705a04bf8eecd05f5e88be89c4e"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" event={"ID":"db8adc8a-118c-485d-b752-c3d0a2888458","Type":"ContainerStarted","Data":"f5ca3c51b73b4beb661b006f96edfb94984f0c9745405482264fade592c171ab"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517585 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" event={"ID":"db8adc8a-118c-485d-b752-c3d0a2888458","Type":"ContainerStarted","Data":"83f2acf12429f975ed1f532b946811b0fc7dee767180231b6f6f582cce037c72"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.517595 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" event={"ID":"db8adc8a-118c-485d-b752-c3d0a2888458","Type":"ContainerStarted","Data":"232abda2c862707711617c9c81505a0449f93367cd5eeae1187998aebd758d6a"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.519555 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9pnb\" (UniqueName: \"kubernetes.io/projected/47e6217b-24f0-495d-8c8e-cb684083e8dd-kube-api-access-p9pnb\") pod \"multus-admission-controller-857f4d67dd-8z5sz\" (UID: \"47e6217b-24f0-495d-8c8e-cb684083e8dd\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.524682 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" event={"ID":"7cab68e6-3113-4536-b5c3-2293265c9502","Type":"ContainerStarted","Data":"388df883c464ffdffeef6f390a80ef8d644a2dddb924b82d9d055aea32f91719"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.524750 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" event={"ID":"7cab68e6-3113-4536-b5c3-2293265c9502","Type":"ContainerStarted","Data":"4310ad87f99a35e18e06c9727f3648526226a191c0798ed5c5189f02060bf3c8"} Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.542544 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2fq5\" (UniqueName: \"kubernetes.io/projected/39fc27d8-358f-40a9-8128-35b2e8f96f5c-kube-api-access-t2fq5\") pod \"ingress-canary-ttgpc\" (UID: \"39fc27d8-358f-40a9-8128-35b2e8f96f5c\") " pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.548909 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnnw2\" (UniqueName: \"kubernetes.io/projected/4718399b-22db-4443-bc52-22e461891f11-kube-api-access-mnnw2\") pod \"migrator-59844c95c7-97wdt\" (UID: \"4718399b-22db-4443-bc52-22e461891f11\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.562485 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.580061 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.588605 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.589570 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.089543981 +0000 UTC m=+147.395321005 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.603932 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtcp4\" (UniqueName: \"kubernetes.io/projected/f6821801-4d19-46b9-8a38-e55810cb2dbf-kube-api-access-dtcp4\") pod \"service-ca-9c57cc56f-8kqjp\" (UID: \"f6821801-4d19-46b9-8a38-e55810cb2dbf\") " pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.614591 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vcnl\" (UniqueName: \"kubernetes.io/projected/75b25a91-05cf-44c9-b656-c8a07515d84f-kube-api-access-5vcnl\") pod \"olm-operator-6b444d44fb-4txdd\" (UID: \"75b25a91-05cf-44c9-b656-c8a07515d84f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.615286 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.629146 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.634716 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psmsg\" (UniqueName: \"kubernetes.io/projected/7bfd245d-eb59-40d5-b1cf-517beaa46f32-kube-api-access-psmsg\") pod \"catalog-operator-68c6474976-bgm7n\" (UID: \"7bfd245d-eb59-40d5-b1cf-517beaa46f32\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.638018 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfhft\" (UniqueName: \"kubernetes.io/projected/30b3f92f-3bf5-448c-9542-6217fb51f239-kube-api-access-sfhft\") pod \"machine-config-server-pjtrg\" (UID: \"30b3f92f-3bf5-448c-9542-6217fb51f239\") " pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.647106 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.652072 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bkh6\" (UniqueName: \"kubernetes.io/projected/31004ea4-e1fa-489e-a44e-701f370c9899-kube-api-access-4bkh6\") pod \"machine-config-controller-84d6567774-dhlhj\" (UID: \"31004ea4-e1fa-489e-a44e-701f370c9899\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.665292 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.668743 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") pod \"marketplace-operator-79b997595-vxtrq\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.692343 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.692952 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.19292755 +0000 UTC m=+147.498704434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.696066 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c89zl\" (UniqueName: \"kubernetes.io/projected/995e8bee-a3aa-466d-8007-4d12eab0d045-kube-api-access-c89zl\") pod \"dns-default-th968\" (UID: \"995e8bee-a3aa-466d-8007-4d12eab0d045\") " pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.717948 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.719528 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wp9h\" (UniqueName: \"kubernetes.io/projected/697daa16-5938-4757-a941-83fa9dbb019b-kube-api-access-6wp9h\") pod \"package-server-manager-789f6589d5-49cqc\" (UID: \"697daa16-5938-4757-a941-83fa9dbb019b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.733427 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.746682 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khrgg\" (UniqueName: \"kubernetes.io/projected/9e947476-a4bc-441e-97ab-2caba294339b-kube-api-access-khrgg\") pod \"service-ca-operator-777779d784-cz4cq\" (UID: \"9e947476-a4bc-441e-97ab-2caba294339b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.758299 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ttgpc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.781841 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.782189 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-th968" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.800076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.800903 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pjtrg" Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.809026 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.309011758 +0000 UTC m=+147.614788642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.860156 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.881151 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.894264 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.905029 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.906848 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:28 crc kubenswrapper[4774]: E0127 00:09:28.909921 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.409902871 +0000 UTC m=+147.715679755 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.937078 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7"] Jan 27 00:09:28 crc kubenswrapper[4774]: I0127 00:09:28.939270 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kv45n" podStartSLOduration=126.939253157 podStartE2EDuration="2m6.939253157s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:28.93440256 +0000 UTC m=+147.240179444" watchObservedRunningTime="2026-01-27 00:09:28.939253157 +0000 UTC m=+147.245030041" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.012535 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.012828 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.512816946 +0000 UTC m=+147.818593830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.113972 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.114935 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.614911265 +0000 UTC m=+147.920688159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.199888 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.216477 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.216906 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.716891931 +0000 UTC m=+148.022668815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.230035 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nt6rp"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.233376 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.317790 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.320235 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.320807 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.820762196 +0000 UTC m=+148.126539080 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.321044 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.321968 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.821955611 +0000 UTC m=+148.127732495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.340345 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" podStartSLOduration=127.340323763 podStartE2EDuration="2m7.340323763s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:29.339612911 +0000 UTC m=+147.645389805" watchObservedRunningTime="2026-01-27 00:09:29.340323763 +0000 UTC m=+147.646100657" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.371246 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.373281 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.423145 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.423823 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:29.923784793 +0000 UTC m=+148.229561677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.429834 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-r9m95" podStartSLOduration=127.429812457 podStartE2EDuration="2m7.429812457s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:29.429534929 +0000 UTC m=+147.735311813" watchObservedRunningTime="2026-01-27 00:09:29.429812457 +0000 UTC m=+147.735589351" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.465628 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.496724 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-776sg"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.526089 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.526479 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.02646238 +0000 UTC m=+148.332239254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.591917 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" event={"ID":"86bc63c8-36a5-4c7a-bf5d-c8f93744a251","Type":"ContainerStarted","Data":"644c4861c08ce86fe753e4bd8cd17c371bbebaa8fe5ede236b2b6fcdff137bce"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.594281 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.598344 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" event={"ID":"05b20433-569d-4f1d-acd8-127119a934e1","Type":"ContainerStarted","Data":"e6d3f7de559282c231a4c916b4f48e91e390980192259b8dbf4a4779fe3268e6"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.627712 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.628098 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.128081836 +0000 UTC m=+148.433858710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.628437 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" event={"ID":"98696d49-1f48-4d0d-9e26-691558f704c5","Type":"ContainerStarted","Data":"0a57d0fb8a5602d47a075a2fd2b2312032aedb31af42a2e20363da20acb2c7e5"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.635270 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" event={"ID":"7e83c76a-d44d-4a1b-b904-857dad56b5ba","Type":"ContainerStarted","Data":"eee927cec084ec215de7c85deb5cb412ce92a14eed54c1852a7c707bfd2a57b6"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.650006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" event={"ID":"ec68f29b-2b30-4dff-b875-7617466be51b","Type":"ContainerStarted","Data":"9829d5d366c15f8a211861ade046b712e6de98c79ae44758cbcfcef44c84b8d1"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.650071 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" event={"ID":"ec68f29b-2b30-4dff-b875-7617466be51b","Type":"ContainerStarted","Data":"094c34af32cd5eaea1f5457aa5e6dd3fb684dafbcd4d452b53af81c3814744b1"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.654352 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" event={"ID":"ffb07ef8-54f3-448a-90ac-3da3e0d686ae","Type":"ContainerStarted","Data":"26cb23b34a6f56166c5bf2249e54d83f725ea2e7cf7b0f1467fbf6422bae9b61"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.657896 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" event={"ID":"8c9ce6bb-224d-498c-9299-762d84b0eaa3","Type":"ContainerStarted","Data":"a42df0ff4fd1f9cfef43be24bbd8168ac2a4b616d00beadad8b9eb3134007fb5"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.658893 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" event={"ID":"f8bb8619-eac4-481e-bdb8-5fb5985c1844","Type":"ContainerStarted","Data":"6c6a9cbe32487a19776f84a95c618edb7f8a9a884ffdc154828b802d8fe8a819"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.666603 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nt6rp" event={"ID":"475a4aef-33e7-40ad-b7c4-20a10efa4ec3","Type":"ContainerStarted","Data":"c3306b03c14de85dd13d3daee11acc3abe88542bfecef256ec380233752ab420"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.675072 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29491200-4blsg" event={"ID":"518a161e-aeab-4ad6-a2c0-dee7ec963958","Type":"ContainerStarted","Data":"0291aad17e7875c8c8bae7dffadc2698234bc0f4a826218ffa06fe13e248194a"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.680696 4774 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-m59xl"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.689439 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" event={"ID":"e6f70216-3d55-4dd8-81f7-f2129a277407","Type":"ContainerStarted","Data":"428513d6e1f3ea985e70ec211b62da2cc65ea52ed5f48867223a9603cf158e70"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.711430 4774 csr.go:261] certificate signing request csr-26lcx is approved, waiting to be issued Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.725100 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" event={"ID":"459b67d7-cbf3-4890-a9a2-3e143ee459aa","Type":"ContainerStarted","Data":"009aae102f3825eab8a6d060ac3aeb4bbb12d9d6f4d21d29c5d213945d39be76"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.725153 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" event={"ID":"459b67d7-cbf3-4890-a9a2-3e143ee459aa","Type":"ContainerStarted","Data":"d217ac2925ef3ec95a52417166140dac43c7082f20dbdab3f1b0abf42a7a717e"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.725945 4774 csr.go:257] certificate signing request csr-26lcx is issued Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.730650 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-44dpr" event={"ID":"607f973b-fc78-4c11-bc09-fdbbe414d8c7","Type":"ContainerStarted","Data":"31c87a87bbb038977e219e5717d5635ed2ae9ac45791250d6a6f3fd630ae34ff"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.730700 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-44dpr" event={"ID":"607f973b-fc78-4c11-bc09-fdbbe414d8c7","Type":"ContainerStarted","Data":"9d72c9f2751184146cfe5ecec6413cd81a71ffdab0d5232d298c577bcd7c0051"} Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.739235 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.742166 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-w8qm5" podStartSLOduration=127.742144732 podStartE2EDuration="2m7.742144732s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:29.707203574 +0000 UTC m=+148.012980458" watchObservedRunningTime="2026-01-27 00:09:29.742144732 +0000 UTC m=+148.047921616" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.743343 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.243320377 +0000 UTC m=+148.549097261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.794973 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z"] Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.842477 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.842715 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.342675773 +0000 UTC m=+148.648452657 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.843325 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.845725 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.345705915 +0000 UTC m=+148.651482799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.888087 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" podStartSLOduration=127.88805557 podStartE2EDuration="2m7.88805557s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:29.821249808 +0000 UTC m=+148.127026692" watchObservedRunningTime="2026-01-27 00:09:29.88805557 +0000 UTC m=+148.193832454" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.918125 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-r9m95" Jan 27 00:09:29 crc kubenswrapper[4774]: I0127 00:09:29.952016 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:29 crc kubenswrapper[4774]: E0127 00:09:29.952684 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.452665374 +0000 UTC m=+148.758442258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.053984 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.054456 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.554436834 +0000 UTC m=+148.860213738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.065357 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-p6qnr" podStartSLOduration=128.065334006 podStartE2EDuration="2m8.065334006s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.030138971 +0000 UTC m=+148.335915855" watchObservedRunningTime="2026-01-27 00:09:30.065334006 +0000 UTC m=+148.371110890" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.158877 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.159301 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.659279577 +0000 UTC m=+148.965056461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: W0127 00:09:30.188125 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea5ac78_6111_4f12_a2db_3a6f4e400d39.slice/crio-7fb06d0c305e1fd99059b213ae5d901b5e971d2d6ca86c96f64c7544a2b440aa WatchSource:0}: Error finding container 7fb06d0c305e1fd99059b213ae5d901b5e971d2d6ca86c96f64c7544a2b440aa: Status 404 returned error can't find the container with id 7fb06d0c305e1fd99059b213ae5d901b5e971d2d6ca86c96f64c7544a2b440aa Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.218404 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.260275 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.260327 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.260396 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.262509 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.762486141 +0000 UTC m=+149.068263035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.263989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.290917 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.307082 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:30 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:30 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:30 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.307151 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.361718 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.362111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.362172 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.365114 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.865063635 +0000 UTC m=+149.170840519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.406218 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.429775 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.437617 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.465221 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.465623 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:30.965608848 +0000 UTC m=+149.271385732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.490732 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-gstl6" podStartSLOduration=128.490708555 podStartE2EDuration="2m8.490708555s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.464148873 +0000 UTC m=+148.769925767" watchObservedRunningTime="2026-01-27 00:09:30.490708555 +0000 UTC m=+148.796485439" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.568182 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.568747 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.068719589 +0000 UTC m=+149.374496473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.584510 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5rsb9" podStartSLOduration=128.58449293 podStartE2EDuration="2m8.58449293s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.582396467 +0000 UTC m=+148.888173351" watchObservedRunningTime="2026-01-27 00:09:30.58449293 +0000 UTC m=+148.890269814" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.627128 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw"] Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.672700 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.673075 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.173062287 +0000 UTC m=+149.478839171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.685257 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.709759 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-44dpr" podStartSLOduration=128.709739918 podStartE2EDuration="2m8.709739918s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.672418307 +0000 UTC m=+148.978195191" watchObservedRunningTime="2026-01-27 00:09:30.709739918 +0000 UTC m=+149.015516802" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.725220 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.730371 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-27 00:04:29 +0000 UTC, rotation deadline is 2026-11-04 01:54:12.151842091 +0000 UTC Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.730411 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6745h44m41.421433422s for next certificate rotation Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.750400 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-hm9kj" podStartSLOduration=128.75037835 podStartE2EDuration="2m8.75037835s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.74611992 +0000 UTC m=+149.051896804" watchObservedRunningTime="2026-01-27 00:09:30.75037835 +0000 UTC m=+149.056155234" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.758721 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jc5dv" podStartSLOduration=128.758689694 podStartE2EDuration="2m8.758689694s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.710394108 +0000 UTC m=+149.016170992" watchObservedRunningTime="2026-01-27 00:09:30.758689694 +0000 UTC m=+149.064466588" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.773457 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.775172 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.275151536 +0000 UTC m=+149.580928420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.847932 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" event={"ID":"9ea5ac78-6111-4f12-a2db-3a6f4e400d39","Type":"ContainerStarted","Data":"7fb06d0c305e1fd99059b213ae5d901b5e971d2d6ca86c96f64c7544a2b440aa"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.866751 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tjg8b" podStartSLOduration=128.866728855 podStartE2EDuration="2m8.866728855s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.8642864 +0000 UTC m=+149.170063284" watchObservedRunningTime="2026-01-27 00:09:30.866728855 +0000 UTC m=+149.172505739" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.867178 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29491200-4blsg" podStartSLOduration=128.867172268 podStartE2EDuration="2m8.867172268s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.8083056 +0000 UTC m=+149.114082514" watchObservedRunningTime="2026-01-27 00:09:30.867172268 +0000 UTC m=+149.172949152" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.899515 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-776sg" event={"ID":"193a08ea-86ea-4176-898e-6a2c476ea6e9","Type":"ContainerStarted","Data":"76c579a20c933610cbedd0ccf1a09d14df4cfd53b16f63aad173c5564069f9ba"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.900998 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:30 crc kubenswrapper[4774]: E0127 00:09:30.901335 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.401325191 +0000 UTC m=+149.707102075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.913214 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pjtrg" event={"ID":"30b3f92f-3bf5-448c-9542-6217fb51f239","Type":"ContainerStarted","Data":"a9c15889495054b6278cabf6b04a47f7ab566759beeb475114b78141e6a616fd"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.915521 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" event={"ID":"35026702-de7c-4f5f-8714-f1d7f89adae6","Type":"ContainerStarted","Data":"fcfeb7938157beba9e8fd3ea370ef1873a89ecd03a28947a3e88448549036f32"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.933742 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" event={"ID":"8c9ce6bb-224d-498c-9299-762d84b0eaa3","Type":"ContainerStarted","Data":"28b5da5a446fe8b0be06271f31998faa4f7613cd88185a5550982f6d5631f536"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.940435 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8z5sz"] Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.952586 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" event={"ID":"dcd12b52-828d-4244-99e9-2291f3a0bbb6","Type":"ContainerStarted","Data":"071506ac8e112236d78c54daccebafd4d82846318aa8cd3fac2efdaffc03e7fc"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.952635 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"ef636c2e64d05f81ed5455b2e615257d8560a2f754d13edaafaea28c35ce7218"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.952648 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" event={"ID":"5be86044-916e-44ba-9855-360b0ae2471a","Type":"ContainerStarted","Data":"0e425d6385bd827f0a6fa6c5fa9778a63a3d93117d41e2eb9dd4b0493eda1941"} Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.982077 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8dbr9" podStartSLOduration=128.982045238 podStartE2EDuration="2m8.982045238s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:30.978086637 +0000 UTC m=+149.283863521" watchObservedRunningTime="2026-01-27 00:09:30.982045238 +0000 UTC m=+149.287822142" Jan 27 00:09:30 crc kubenswrapper[4774]: I0127 00:09:30.989948 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.001831 
4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.002310 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.502293108 +0000 UTC m=+149.808069992 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.105887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.108312 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.608295226 +0000 UTC m=+149.914072110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.207134 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.208593 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.70857169 +0000 UTC m=+150.014348574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.208698 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.209328 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.709318463 +0000 UTC m=+150.015095337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.245903 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:31 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:31 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:31 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.253518 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.310839 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.311535 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.811517646 +0000 UTC m=+150.117294520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.368077 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" podStartSLOduration=129.368054134 podStartE2EDuration="2m9.368054134s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:31.001282216 +0000 UTC m=+149.307059100" watchObservedRunningTime="2026-01-27 00:09:31.368054134 +0000 UTC m=+149.673831018" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.372943 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.413783 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.414124 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:31.914110161 +0000 UTC m=+150.219887055 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: W0127 00:09:31.432933 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4718399b_22db_4443_bc52_22e461891f11.slice/crio-d6ba34ea1aa05c64666fc87c9dcb27303fa7a6c055d2ffd300b16be709449b57 WatchSource:0}: Error finding container d6ba34ea1aa05c64666fc87c9dcb27303fa7a6c055d2ffd300b16be709449b57: Status 404 returned error can't find the container with id d6ba34ea1aa05c64666fc87c9dcb27303fa7a6c055d2ffd300b16be709449b57 Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.517711 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.519030 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.019013007 +0000 UTC m=+150.324789891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.545480 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ttgpc"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.567192 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.587264 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.601497 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.620148 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.620870 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.120843488 +0000 UTC m=+150.426620372 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: W0127 00:09:31.648071 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e947476_a4bc_441e_97ab_2caba294339b.slice/crio-b859356ece73a046879bd4a66087cc3d8c676f3589c47e257ad8b7771aa7b303 WatchSource:0}: Error finding container b859356ece73a046879bd4a66087cc3d8c676f3589c47e257ad8b7771aa7b303: Status 404 returned error can't find the container with id b859356ece73a046879bd4a66087cc3d8c676f3589c47e257ad8b7771aa7b303 Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.723260 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.723565 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.223531696 +0000 UTC m=+150.529308620 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.762845 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.764690 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.786345 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.797456 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.811901 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.825946 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.826292 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.326278015 +0000 UTC m=+150.632054899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.833677 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.856152 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-th968"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.901442 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8kqjp"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.926844 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.927134 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.427110536 +0000 UTC m=+150.732887420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.927524 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:31 crc kubenswrapper[4774]: E0127 00:09:31.927889 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.42787397 +0000 UTC m=+150.733650854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.932850 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n"] Jan 27 00:09:31 crc kubenswrapper[4774]: I0127 00:09:31.983279 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th968" event={"ID":"995e8bee-a3aa-466d-8007-4d12eab0d045","Type":"ContainerStarted","Data":"253dc5936ce6f313e58f919994fd42f5addeb0a09d3a3889e2238b4b73343f75"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.003259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" event={"ID":"75b25a91-05cf-44c9-b656-c8a07515d84f","Type":"ContainerStarted","Data":"5667fec29144ced5f25dd25bd78d8faf881c9c538dc4f6142b5ef1049cc27559"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.003342 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" event={"ID":"75b25a91-05cf-44c9-b656-c8a07515d84f","Type":"ContainerStarted","Data":"5d53a47b6727ca457243733c5afb6f47f1e0656cee8b6ed1c9aaadb22b92c794"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.004940 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.011123 4774 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4txdd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.011214 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" podUID="75b25a91-05cf-44c9-b656-c8a07515d84f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.025092 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" event={"ID":"9ea5ac78-6111-4f12-a2db-3a6f4e400d39","Type":"ContainerStarted","Data":"3f0374186addd2423546b11bdbacd76d8615c4e447b7c172f220d14c48166675"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.029544 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.029983 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.529965169 +0000 UTC m=+150.835742053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.034726 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" podStartSLOduration=130.034644542 podStartE2EDuration="2m10.034644542s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.03099351 +0000 UTC m=+150.336770394" watchObservedRunningTime="2026-01-27 00:09:32.034644542 +0000 UTC m=+150.340421426" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.035142 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3571959ac4a5cb62d895465a103b5a5233d7721540e7f3d218cd9aec19eac2fd"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.044514 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" event={"ID":"9e947476-a4bc-441e-97ab-2caba294339b","Type":"ContainerStarted","Data":"b859356ece73a046879bd4a66087cc3d8c676f3589c47e257ad8b7771aa7b303"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.052319 4774 generic.go:334] "Generic (PLEG): container finished" podID="e6f70216-3d55-4dd8-81f7-f2129a277407" containerID="891ae5323767ab807707955951556a8047d50847bb6a7c6c3865b62a00f99e38" exitCode=0 Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.052425 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" event={"ID":"e6f70216-3d55-4dd8-81f7-f2129a277407","Type":"ContainerDied","Data":"891ae5323767ab807707955951556a8047d50847bb6a7c6c3865b62a00f99e38"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.059548 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rnh4z" podStartSLOduration=130.059505722 podStartE2EDuration="2m10.059505722s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.054222441 +0000 UTC m=+150.359999325" watchObservedRunningTime="2026-01-27 00:09:32.059505722 +0000 UTC m=+150.365282606" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.085349 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" 
event={"ID":"697daa16-5938-4757-a941-83fa9dbb019b","Type":"ContainerStarted","Data":"a927f23907f42e1cbcad0ee3147701d27155472a79bd400df20e0ba7d71a81c1"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.104403 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"3ddc87762ac7bc7162f2460dc71b0e9912f687657344ce7c680463a350a9d5f5"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.114950 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" event={"ID":"47e6217b-24f0-495d-8c8e-cb684083e8dd","Type":"ContainerStarted","Data":"7455dfbc1045f78c76a5a76da0693c0c226b541b9c267d62cebb81b21e93b9a1"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.132145 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.134293 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.634275517 +0000 UTC m=+150.940052401 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.134598 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" event={"ID":"5be86044-916e-44ba-9855-360b0ae2471a","Type":"ContainerStarted","Data":"c2257101fa8cb5a675a5ab94b7d961a6634b3f8cb8296455232bcdda9f55f1ba"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.139707 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" event={"ID":"dcd12b52-828d-4244-99e9-2291f3a0bbb6","Type":"ContainerStarted","Data":"2510ce46ee90a783abedcd5bc9344ef584d36b4177a94213a199cca876f9dff7"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.153065 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" event={"ID":"98696d49-1f48-4d0d-9e26-691558f704c5","Type":"ContainerStarted","Data":"eb26eb479e0175ce267ac71c1b6249f0846c485d772ab1577426c90f5324ce03"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.175490 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" event={"ID":"4718399b-22db-4443-bc52-22e461891f11","Type":"ContainerStarted","Data":"d6ba34ea1aa05c64666fc87c9dcb27303fa7a6c055d2ffd300b16be709449b57"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.196365 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" event={"ID":"337a9389-b527-41a3-aba6-fada15fb648c","Type":"ContainerStarted","Data":"a99eea7bcb77baeb35c5dc052554c1290e8eec8cc836d4b0fc5e5531efd66db7"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.196423 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" event={"ID":"337a9389-b527-41a3-aba6-fada15fb648c","Type":"ContainerStarted","Data":"3d7c142fabd8a7c17405f161de3a672d2f489778ee7da90531cdd489c533d11c"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.207748 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nt6rp" event={"ID":"475a4aef-33e7-40ad-b7c4-20a10efa4ec3","Type":"ContainerStarted","Data":"5901298c40847fba572d02477eba5ed986950dbd27fc2f5646d651b5af5be153"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.209221 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.218115 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.218198 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.224730 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:32 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:32 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:32 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.224792 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.231841 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" event={"ID":"31004ea4-e1fa-489e-a44e-701f370c9899","Type":"ContainerStarted","Data":"ba80ccf411fcdfbf653add4e741aff9d9ad99581931e0786a6e518c6f10a4d64"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.232197 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" podStartSLOduration=130.232173888 podStartE2EDuration="2m10.232173888s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.23190445 +0000 UTC m=+150.537681354" 
watchObservedRunningTime="2026-01-27 00:09:32.232173888 +0000 UTC m=+150.537950762" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.233408 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.234544 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.73452775 +0000 UTC m=+151.040304624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.237517 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-772k8" podStartSLOduration=130.237504681 podStartE2EDuration="2m10.237504681s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.165688156 +0000 UTC m=+150.471465050" watchObservedRunningTime="2026-01-27 00:09:32.237504681 +0000 UTC m=+150.543281565" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.274411 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-776sg" event={"ID":"193a08ea-86ea-4176-898e-6a2c476ea6e9","Type":"ContainerStarted","Data":"bfd7ace510f6eb2c92ab1cedcac676d157af0bdab38fdfc57c7b84c3760dbd02"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.290559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ttgpc" event={"ID":"39fc27d8-358f-40a9-8128-35b2e8f96f5c","Type":"ContainerStarted","Data":"44bd72fd9140eea0820a67809412aa6f424d95f0545367674cbaa599f66dc752"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.294709 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" event={"ID":"959cd32e-9b14-4df3-aadf-d51b5a5a3c14","Type":"ContainerStarted","Data":"1cb130ec19ed17fb84b3d80eb2d469834c4b9bdd4b7044a29315aed07866d003"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.295992 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.302431 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" event={"ID":"0b613d2c-46d4-461a-9414-d2ce3b1788bf","Type":"ContainerStarted","Data":"aa515064c5269cbfe12aef822a7cd645b59d236ad188742d6b364f88aa50ad05"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.304970 4774 patch_prober.go:28] 
interesting pod/packageserver-d55dfcdfc-4btn7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.305023 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" podUID="959cd32e-9b14-4df3-aadf-d51b5a5a3c14" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.315746 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nt6rp" podStartSLOduration=130.315723162 podStartE2EDuration="2m10.315723162s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.275366148 +0000 UTC m=+150.581143052" watchObservedRunningTime="2026-01-27 00:09:32.315723162 +0000 UTC m=+150.621500046" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.315875 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" event={"ID":"05b20433-569d-4f1d-acd8-127119a934e1","Type":"ContainerStarted","Data":"e625750d0dcf84e9ce73986d0f5c3c2f42aee3261cf5c6b2cf7f5bc93a823f2f"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.347375 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.350105 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.850084991 +0000 UTC m=+151.155861875 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.355708 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"65f41b703ac2a3495a5997805adbdeb5fea07a418c84c30891544bb7306d7cb9"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.374966 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7qfw" podStartSLOduration=130.374947861 podStartE2EDuration="2m10.374947861s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.314085491 +0000 UTC m=+150.619862375" watchObservedRunningTime="2026-01-27 00:09:32.374947861 +0000 UTC m=+150.680724745" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.394788 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" event={"ID":"35026702-de7c-4f5f-8714-f1d7f89adae6","Type":"ContainerStarted","Data":"9d3b54aefefce63346f1f00ddc4f46e09cd0df1268170f92b232f34fa16f2114"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.442417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pjtrg" event={"ID":"30b3f92f-3bf5-448c-9542-6217fb51f239","Type":"ContainerStarted","Data":"7529f2ff9ea726ce2e976d4f7124e5201c4cf044a54240f341fa33409a178046"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.456737 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ttgpc" podStartSLOduration=7.456709199 podStartE2EDuration="7.456709199s" podCreationTimestamp="2026-01-27 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.394275752 +0000 UTC m=+150.700052626" watchObservedRunningTime="2026-01-27 00:09:32.456709199 +0000 UTC m=+150.762486073" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.457151 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.458606 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:32.958564726 +0000 UTC m=+151.264341610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.459677 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-776sg" podStartSLOduration=130.459669369 podStartE2EDuration="2m10.459669369s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.45640106 +0000 UTC m=+150.762177964" watchObservedRunningTime="2026-01-27 00:09:32.459669369 +0000 UTC m=+150.765446253" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.473874 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" event={"ID":"f8bb8619-eac4-481e-bdb8-5fb5985c1844","Type":"ContainerStarted","Data":"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.473926 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.492496 4774 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-brc4j container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.492538 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.512730 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" event={"ID":"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72","Type":"ContainerStarted","Data":"68dbadf6774272234ad63109a60f8646d78e5ffd2b9aca24a0498e0100dbeb4f"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.583589 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" event={"ID":"7e83c76a-d44d-4a1b-b904-857dad56b5ba","Type":"ContainerStarted","Data":"8c92e747083941367d7da529da515147cb53c57429877dad5243cdb4b11c8b39"} Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.586760 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.591598 4774 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-vxtrq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.591658 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.594202 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.595931 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.095912333 +0000 UTC m=+151.401689207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.604513 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9r6fm" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.624571 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" podStartSLOduration=130.624547638 podStartE2EDuration="2m10.624547638s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.509476542 +0000 UTC m=+150.815253446" watchObservedRunningTime="2026-01-27 00:09:32.624547638 +0000 UTC m=+150.930324522" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.627467 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pjtrg" podStartSLOduration=7.627453726 podStartE2EDuration="7.627453726s" podCreationTimestamp="2026-01-27 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.623578518 +0000 UTC m=+150.929355402" watchObservedRunningTime="2026-01-27 00:09:32.627453726 +0000 UTC m=+150.933230680" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.668302 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h7h7w" podStartSLOduration=130.668280684 podStartE2EDuration="2m10.668280684s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 
00:09:32.666458628 +0000 UTC m=+150.972235502" watchObservedRunningTime="2026-01-27 00:09:32.668280684 +0000 UTC m=+150.974057568" Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.690048 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.692204 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.192176374 +0000 UTC m=+151.497953258 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.793731 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.794607 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.294591544 +0000 UTC m=+151.600368428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.895338 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:32 crc kubenswrapper[4774]: E0127 00:09:32.895779 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.395745575 +0000 UTC m=+151.701522459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:32 crc kubenswrapper[4774]: I0127 00:09:32.963431 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" podStartSLOduration=130.963409822 podStartE2EDuration="2m10.963409822s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:32.963115043 +0000 UTC m=+151.268891917" watchObservedRunningTime="2026-01-27 00:09:32.963409822 +0000 UTC m=+151.269186716" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.002412 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.002786 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.502772775 +0000 UTC m=+151.808549659 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.103649 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.104350 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.604333509 +0000 UTC m=+151.910110393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.176035 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podStartSLOduration=131.176016509 podStartE2EDuration="2m11.176016509s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.170046877 +0000 UTC m=+151.475823761" watchObservedRunningTime="2026-01-27 00:09:33.176016509 +0000 UTC m=+151.481793393" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.206723 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.207226 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.707210372 +0000 UTC m=+152.012987256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.238480 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:33 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:33 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:33 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.238901 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.254088 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ddxk7" podStartSLOduration=131.254069214 podStartE2EDuration="2m11.254069214s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.213227746 +0000 UTC m=+151.519004650" watchObservedRunningTime="2026-01-27 00:09:33.254069214 +0000 UTC m=+151.559846098" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.309634 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.310049 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.810024494 +0000 UTC m=+152.115801378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.411154 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.411514 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:33.911502684 +0000 UTC m=+152.217279558 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.512983 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.513560 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.013534082 +0000 UTC m=+152.319310956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.594171 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" event={"ID":"4718399b-22db-4443-bc52-22e461891f11","Type":"ContainerStarted","Data":"90268d198bd720c510a581bccf515d612203e072791ede054c21f802099373fb"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.594230 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" event={"ID":"4718399b-22db-4443-bc52-22e461891f11","Type":"ContainerStarted","Data":"dcb8c6d9a33792d72682deba4d220be623f5732b9d06b099a9fae485f8604fc4"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.603142 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f5ce666c52b75c7420f9e32dbd1584b55dc2456393c63ed931418be0f3d1bf5"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.614588 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.615232 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.115201519 +0000 UTC m=+152.420978403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.616928 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" event={"ID":"9e947476-a4bc-441e-97ab-2caba294339b","Type":"ContainerStarted","Data":"e9fa4fe7cd9c2b10ed164729f43ce502eda4c35bfb6728f59fb5a61c815aa51e"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.619312 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" event={"ID":"697daa16-5938-4757-a941-83fa9dbb019b","Type":"ContainerStarted","Data":"ed16abebccd99073522a96ad212314cb4715dd7d871bd9b74043748576d1fab7"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.619376 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" event={"ID":"697daa16-5938-4757-a941-83fa9dbb019b","Type":"ContainerStarted","Data":"60fe548956b49516d76d1c1c5e73d5fe0ecd648fd4872217d7312b565adf7b07"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.619966 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.624642 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-97wdt" podStartSLOduration=131.624623196 podStartE2EDuration="2m11.624623196s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.620444829 +0000 UTC m=+151.926221723" watchObservedRunningTime="2026-01-27 00:09:33.624623196 +0000 UTC m=+151.930400090" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.637543 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" event={"ID":"35026702-de7c-4f5f-8714-f1d7f89adae6","Type":"ContainerStarted","Data":"fc01f558981d138381fa89374cb543543ad3bd9dfac543431e6e37bd66eea463"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.645116 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5c1d29767d919241d56d4e20a79f76e5963a04206a9f058c1bf0a40c5fe0a422"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.647525 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ttgpc" event={"ID":"39fc27d8-358f-40a9-8128-35b2e8f96f5c","Type":"ContainerStarted","Data":"2fa6c7280f1a5a01dcabed8cc50c68ab8c2ea3aac5ac4b8ba3892907e703e744"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.655975 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" 
event={"ID":"959cd32e-9b14-4df3-aadf-d51b5a5a3c14","Type":"ContainerStarted","Data":"7f8b2006d21a8d987794dea335f80e8f035a231409be346a564596ab6d540fad"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.680121 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" podStartSLOduration=131.680099012 podStartE2EDuration="2m11.680099012s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.678607116 +0000 UTC m=+151.984384000" watchObservedRunningTime="2026-01-27 00:09:33.680099012 +0000 UTC m=+151.985875896" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.685269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" event={"ID":"47e6217b-24f0-495d-8c8e-cb684083e8dd","Type":"ContainerStarted","Data":"eb004ebc60b978a3dd9c2e58364338608b0e5ef4f1dc32050853635d443330f0"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.685319 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" event={"ID":"47e6217b-24f0-495d-8c8e-cb684083e8dd","Type":"ContainerStarted","Data":"b4484f6c66dce909af934f3050c30df56cf9d2c8b41dd9d7b2888fc73b5a8781"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.715460 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.716172 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" event={"ID":"e6f70216-3d55-4dd8-81f7-f2129a277407","Type":"ContainerStarted","Data":"13cbfdfefde4b2b846017257f33cf257e031fe7b8750958529770b5666a5ed71"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.716942 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.718143 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.218122304 +0000 UTC m=+152.523899188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.736219 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" event={"ID":"5be86044-916e-44ba-9855-360b0ae2471a","Type":"ContainerStarted","Data":"22252fd42dd59fdcbd10887e36eee230da096419894251a5af55826bf1cd5b2f"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.763190 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cz4cq" podStartSLOduration=131.76317048 podStartE2EDuration="2m11.76317048s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.707362506 +0000 UTC m=+152.013139390" watchObservedRunningTime="2026-01-27 00:09:33.76317048 +0000 UTC m=+152.068947364" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.771732 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th968" event={"ID":"995e8bee-a3aa-466d-8007-4d12eab0d045","Type":"ContainerStarted","Data":"6cc72cf50a01a0eeaef3c15073d8754459c6aa3753a458b5640d227cceb423c7"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.772576 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-th968" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.811630 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" event={"ID":"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72","Type":"ContainerStarted","Data":"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.813159 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxtrq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.813197 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.818629 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.822428 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.322414671 +0000 UTC m=+152.628191555 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.833992 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r8zvt" podStartSLOduration=131.833979284 podStartE2EDuration="2m11.833979284s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.764624824 +0000 UTC m=+152.070401708" watchObservedRunningTime="2026-01-27 00:09:33.833979284 +0000 UTC m=+152.139756168" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.834771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" event={"ID":"f6821801-4d19-46b9-8a38-e55810cb2dbf","Type":"ContainerStarted","Data":"bc04a1b5d1f6a631f21b2e03f9ae6fc19d4a13350704db08c3fa6f680e42a44c"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.834806 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" event={"ID":"f6821801-4d19-46b9-8a38-e55810cb2dbf","Type":"ContainerStarted","Data":"4bd00e60402df038d82c41fef3a8f8ff6a9aa8f7ae2a65dfb5b950354d30c142"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.861210 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" event={"ID":"7bfd245d-eb59-40d5-b1cf-517beaa46f32","Type":"ContainerStarted","Data":"170df912a9c317355402d62276e4ed8b047cd56d4f5ceccef1caf073d1c7428e"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.861269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" event={"ID":"7bfd245d-eb59-40d5-b1cf-517beaa46f32","Type":"ContainerStarted","Data":"fe62ede7836f2766358f719f369947eea925d509f34f02a16d127d5c988f4ee3"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.862391 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.867724 4774 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bgm7n container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.867790 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" podUID="7bfd245d-eb59-40d5-b1cf-517beaa46f32" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.881022 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" event={"ID":"31004ea4-e1fa-489e-a44e-701f370c9899","Type":"ContainerStarted","Data":"5dd632505274246e91254dddec44b325983f5f143303d7eda0d2ca677d02b528"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.881176 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" event={"ID":"31004ea4-e1fa-489e-a44e-701f370c9899","Type":"ContainerStarted","Data":"075a631581eace9df7d190813bded1f6bd2c5e7d46d625e9fb54d10fff672a8f"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.898838 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5be0f25dbcdf09c743f9fb7856b6a292a71f820b8d2df02e3668d6d9308d0b35"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.899002 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2002ab4f0da77498d61b6a634d8112f92a113453f276047f008cac4e6c357042"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.899572 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.922585 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.923119 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.423086577 +0000 UTC m=+152.728863461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.925789 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8z5sz" podStartSLOduration=131.925772108 podStartE2EDuration="2m11.925772108s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.92350836 +0000 UTC m=+152.229285244" watchObservedRunningTime="2026-01-27 00:09:33.925772108 +0000 UTC m=+152.231548992" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.927163 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" event={"ID":"0b613d2c-46d4-461a-9414-d2ce3b1788bf","Type":"ContainerStarted","Data":"26a6e1695fd4493c96ea71a7b30ed1405b2549d6759d42ff50698cae6462af89"} Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.928634 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.928754 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.935298 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:33 crc kubenswrapper[4774]: E0127 00:09:33.937476 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.437459916 +0000 UTC m=+152.743236800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.948225 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4txdd" Jan 27 00:09:33 crc kubenswrapper[4774]: I0127 00:09:33.983202 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8kqjp" podStartSLOduration=131.983181814 podStartE2EDuration="2m11.983181814s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:33.959368935 +0000 UTC m=+152.265145819" watchObservedRunningTime="2026-01-27 00:09:33.983181814 +0000 UTC m=+152.288958698" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.043521 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.045522 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.545499467 +0000 UTC m=+152.851276351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.146682 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.147120 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.647104573 +0000 UTC m=+152.952881457 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.147921 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" podStartSLOduration=132.147909107 podStartE2EDuration="2m12.147909107s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.077266148 +0000 UTC m=+152.383043032" watchObservedRunningTime="2026-01-27 00:09:34.147909107 +0000 UTC m=+152.453685981" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.152289 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.154395 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dhlhj" podStartSLOduration=132.154382075 podStartE2EDuration="2m12.154382075s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.148305718 +0000 UTC m=+152.454082622" watchObservedRunningTime="2026-01-27 00:09:34.154382075 +0000 UTC m=+152.460158959" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.214881 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-th968" podStartSLOduration=9.214845322 podStartE2EDuration="9.214845322s" podCreationTimestamp="2026-01-27 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.213550103 +0000 UTC m=+152.519327007" watchObservedRunningTime="2026-01-27 00:09:34.214845322 +0000 UTC m=+152.520622206" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.245170 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:34 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:34 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:34 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.245666 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.247671 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.248131 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.748112739 +0000 UTC m=+153.053889623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.283688 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" podStartSLOduration=132.283670085 podStartE2EDuration="2m12.283670085s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.281193029 +0000 UTC m=+152.586969913" watchObservedRunningTime="2026-01-27 00:09:34.283670085 +0000 UTC m=+152.589446969" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.321690 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gf7zx" podStartSLOduration=132.321666586 podStartE2EDuration="2m12.321666586s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.320985256 +0000 UTC m=+152.626762150" watchObservedRunningTime="2026-01-27 00:09:34.321666586 +0000 UTC m=+152.627443470" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.350334 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.350648 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.850636831 +0000 UTC m=+153.156413715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.384405 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" podStartSLOduration=132.384384093 podStartE2EDuration="2m12.384384093s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.350080185 +0000 UTC m=+152.655857069" watchObservedRunningTime="2026-01-27 00:09:34.384384093 +0000 UTC m=+152.690160977" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.451088 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.451325 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.951307428 +0000 UTC m=+153.257084312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.451373 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.451739 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:34.95173123 +0000 UTC m=+153.257508114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.553058 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.553552 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.053520161 +0000 UTC m=+153.359297055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.654768 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.655272 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.155248859 +0000 UTC m=+153.461025943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.658116 4774 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4btn7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.658166 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" podUID="959cd32e-9b14-4df3-aadf-d51b5a5a3c14" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.756173 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.756279 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.256262366 +0000 UTC m=+153.562039250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.756500 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.757053 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.257025579 +0000 UTC m=+153.562802643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.857977 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.858180 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.358138939 +0000 UTC m=+153.663915823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.858506 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.858936 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.358917553 +0000 UTC m=+153.664694437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.933920 4774 generic.go:334] "Generic (PLEG): container finished" podID="0b613d2c-46d4-461a-9414-d2ce3b1788bf" containerID="26a6e1695fd4493c96ea71a7b30ed1405b2549d6759d42ff50698cae6462af89" exitCode=0 Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.934017 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" event={"ID":"0b613d2c-46d4-461a-9414-d2ce3b1788bf","Type":"ContainerDied","Data":"26a6e1695fd4493c96ea71a7b30ed1405b2549d6759d42ff50698cae6462af89"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.937148 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"10e382312a24a32bbcfe09efa9541435c12ed84634265e44fa95274fe3b603f3"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.937178 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"aa29ac077703e9abe30d41513138058abefb3f449b063ad06344e09042cff927"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.937192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" event={"ID":"16b7780a-d64f-4a63-b462-c924d7c44aac","Type":"ContainerStarted","Data":"ae0dd065d36d2ce9b1b8d1854ce10a779cd352216693596a92093376597a2f6a"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.940065 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-th968" event={"ID":"995e8bee-a3aa-466d-8007-4d12eab0d045","Type":"ContainerStarted","Data":"0e06505747eee92ced1229f30d7d6a8cb57e1f02736feb715aa5c87b9ef545e9"} Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.946729 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.946772 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.946956 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxtrq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.947022 4774 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.950584 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bgm7n" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.959261 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:34 crc kubenswrapper[4774]: E0127 00:09:34.959628 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.459593079 +0000 UTC m=+153.765369963 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.976098 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4btn7" Jan 27 00:09:34 crc kubenswrapper[4774]: I0127 00:09:34.982049 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-m59xl" podStartSLOduration=9.982029345 podStartE2EDuration="9.982029345s" podCreationTimestamp="2026-01-27 00:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:34.979948531 +0000 UTC m=+153.285725415" watchObservedRunningTime="2026-01-27 00:09:34.982029345 +0000 UTC m=+153.287806219" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.063339 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.066957 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.56693525 +0000 UTC m=+153.872712134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.163610 4774 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.164657 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.168945 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.668917375 +0000 UTC m=+153.974694259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.175459 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.176647 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.676628791 +0000 UTC m=+153.982405675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.223341 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:35 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:35 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:35 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.223664 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.276463 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.277133 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.777103092 +0000 UTC m=+154.082879976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.277294 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.277666 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.777651128 +0000 UTC m=+154.083428012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.378583 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.378756 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.878729216 +0000 UTC m=+154.184506100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.379089 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.379424 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.879411178 +0000 UTC m=+154.185188062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.479942 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.480193 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.980154046 +0000 UTC m=+154.285930940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.480667 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: E0127 00:09:35.481170 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 00:09:35.981148706 +0000 UTC m=+154.286925590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zgbtz" (UID: "df626623-28b8-43a3-a567-f14b1e95075a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.505313 4774 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-27T00:09:35.163646525Z","Handler":null,"Name":""} Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.512303 4774 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.512351 4774 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.581311 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.585305 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.682357 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.686028 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.686084 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.724512 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zgbtz\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:35 crc kubenswrapper[4774]: I0127 00:09:35.992219 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.222529 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.224145 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.226297 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:36 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:36 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:36 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.226341 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.228399 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.229665 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.289097 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.290618 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.293644 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.296103 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.369598 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390691 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390780 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390816 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390836 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390892 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.390916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.424090 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.436912 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:09:36 crc kubenswrapper[4774]: W0127 00:09:36.447992 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf626623_28b8_43a3_a567_f14b1e95075a.slice/crio-8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf WatchSource:0}: Error finding container 8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf: Status 404 returned error can't find the container with id 8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492478 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492553 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492577 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492595 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492652 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.492676 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493114 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " 
pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493181 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493214 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: E0127 00:09:36.493466 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b613d2c-46d4-461a-9414-d2ce3b1788bf" containerName="collect-profiles" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493767 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493890 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b613d2c-46d4-461a-9414-d2ce3b1788bf" containerName="collect-profiles" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.493977 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.494035 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b613d2c-46d4-461a-9414-d2ce3b1788bf" containerName="collect-profiles" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.494967 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.541912 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.545360 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") pod \"community-operators-lnlbr\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.548773 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.556550 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") pod \"certified-operators-pj24q\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.594256 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") pod \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.597623 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") pod \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.597705 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") pod \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\" (UID: \"0b613d2c-46d4-461a-9414-d2ce3b1788bf\") " Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.599369 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b613d2c-46d4-461a-9414-d2ce3b1788bf" (UID: "0b613d2c-46d4-461a-9414-d2ce3b1788bf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.600326 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b613d2c-46d4-461a-9414-d2ce3b1788bf-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.602664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b613d2c-46d4-461a-9414-d2ce3b1788bf" (UID: "0b613d2c-46d4-461a-9414-d2ce3b1788bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.603373 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p" (OuterVolumeSpecName: "kube-api-access-rp85p") pod "0b613d2c-46d4-461a-9414-d2ce3b1788bf" (UID: "0b613d2c-46d4-461a-9414-d2ce3b1788bf"). InnerVolumeSpecName "kube-api-access-rp85p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.643319 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.675908 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.675984 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.687791 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.689163 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.697076 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.697283 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702270 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702312 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702356 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702464 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") pod \"community-operators-ngr5v\" (UID: 
\"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702519 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702593 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp85p\" (UniqueName: \"kubernetes.io/projected/0b613d2c-46d4-461a-9414-d2ce3b1788bf-kube-api-access-rp85p\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.702614 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b613d2c-46d4-461a-9414-d2ce3b1788bf-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.706691 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.723378 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.802335 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803309 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803358 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803399 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803430 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803455 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " 
pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.803481 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.804498 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.804607 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.806741 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.807920 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.823726 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") pod \"certified-operators-kb8hq\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.824394 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") pod \"community-operators-ngr5v\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.828538 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:09:36 crc kubenswrapper[4774]: W0127 00:09:36.855656 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cad24dd_acc4_40e5_8380_e0d74be79921.slice/crio-9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1 WatchSource:0}: Error finding container 9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1: Status 404 returned error can't find the container with id 9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1 Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.956266 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerStarted","Data":"9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1"} Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.960519 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" event={"ID":"0b613d2c-46d4-461a-9414-d2ce3b1788bf","Type":"ContainerDied","Data":"aa515064c5269cbfe12aef822a7cd645b59d236ad188742d6b364f88aa50ad05"} Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.960567 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa515064c5269cbfe12aef822a7cd645b59d236ad188742d6b364f88aa50ad05" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.960734 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491200-fxl4m" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.965039 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.972035 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" event={"ID":"df626623-28b8-43a3-a567-f14b1e95075a","Type":"ContainerStarted","Data":"3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387"} Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.972111 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" event={"ID":"df626623-28b8-43a3-a567-f14b1e95075a","Type":"ContainerStarted","Data":"8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf"} Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.974214 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.979746 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-j4bbk" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.986802 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.987789 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.995596 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 27 00:09:36 crc kubenswrapper[4774]: I0127 00:09:36.995911 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:36.998647 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" podStartSLOduration=134.998612375 podStartE2EDuration="2m14.998612375s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:09:36.996232401 +0000 UTC m=+155.302009305" watchObservedRunningTime="2026-01-27 00:09:36.998612375 +0000 UTC m=+155.304389279" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.019293 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.059270 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.109752 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.110083 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.115653 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.122768 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vl2l2" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.211922 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.212429 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.212738 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.226138 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:37 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:37 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:37 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.226212 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.238113 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.315608 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.330339 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:09:37 crc kubenswrapper[4774]: W0127 00:09:37.374172 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f2c4e4_f111_453e_b953_cdf2528f9a3e.slice/crio-fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76 WatchSource:0}: Error finding container fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76: Status 404 returned error can't find the container with id fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.539644 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.976811 4774 generic.go:334] "Generic (PLEG): container finished" podID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerID="063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89" exitCode=0 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.976888 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerDied","Data":"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.979113 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.979751 4774 generic.go:334] "Generic (PLEG): container finished" podID="5e621715-39b3-4094-b589-ede8b74b0e8f" 
containerID="5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202" exitCode=0 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.979833 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerDied","Data":"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.979932 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerStarted","Data":"fafa96ca772c719431784a78d0a35959db893c145e408b85c8cbdfa6a0078917"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.982071 4774 generic.go:334] "Generic (PLEG): container finished" podID="af002227-deb6-4a24-8ce5-051f93bc178b" containerID="817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3" exitCode=0 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.982163 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerDied","Data":"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.982197 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerStarted","Data":"f9e3168af24b8a48492cc6713c61bfd5396824e8a18842b10c09f8a37f4eed28"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.984187 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d69e5e66-2cd3-4102-9126-0577e4442e7b","Type":"ContainerStarted","Data":"ef257a72263bb152bc90b02e175f4869660c82a437b852f0f0c82132e335ab9f"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.985887 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerID="32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf" exitCode=0 Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.985975 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerDied","Data":"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf"} Jan 27 00:09:37 crc kubenswrapper[4774]: I0127 00:09:37.986017 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerStarted","Data":"fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76"} Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.121657 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.122101 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: 
connection refused" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.121829 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-nt6rp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.122459 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nt6rp" podUID="475a4aef-33e7-40ad-b7c4-20a10efa4ec3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.171293 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.171349 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.174009 4774 patch_prober.go:28] interesting pod/console-f9d7485db-776sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.174084 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-776sg" podUID="193a08ea-86ea-4176-898e-6a2c476ea6e9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.219093 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.225063 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:38 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:38 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:38 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.225146 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.287861 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.289075 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.291528 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.299844 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.432354 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.432416 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.432451 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.533865 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.534011 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.534033 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.534460 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.534560 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") pod \"redhat-marketplace-ds9fx\" (UID: 
\"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.554433 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") pod \"redhat-marketplace-ds9fx\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.611468 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.687846 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.689225 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.697699 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.839057 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.839467 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.839542 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.899946 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.918001 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.940539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.941110 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") pod \"redhat-marketplace-tlk6f\" (UID: 
\"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.941142 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.941219 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.941487 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.963409 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") pod \"redhat-marketplace-tlk6f\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:38 crc kubenswrapper[4774]: I0127 00:09:38.993531 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerStarted","Data":"7c895fbfa4a8556de4f63281aed06489f2f81ec44684ed968eb3615baa9f463f"} Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.010302 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d69e5e66-2cd3-4102-9126-0577e4442e7b","Type":"ContainerDied","Data":"5df84b7c9c635777323921eed7ce0bdc1ff6c29233389784fd92a34993ae9156"} Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.010260 4774 generic.go:334] "Generic (PLEG): container finished" podID="d69e5e66-2cd3-4102-9126-0577e4442e7b" containerID="5df84b7c9c635777323921eed7ce0bdc1ff6c29233389784fd92a34993ae9156" exitCode=0 Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.035372 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.227523 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:39 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:39 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:39 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.227951 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.289009 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.290074 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.292996 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.297284 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.357329 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.357378 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.357442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.459116 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.459639 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") pod \"redhat-operators-97k6k\" (UID: 
\"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.459668 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.460172 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.460386 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.481061 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") pod \"redhat-operators-97k6k\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.558084 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.652039 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.688109 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.689191 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.701183 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.764547 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.764604 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.764735 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.867232 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.867986 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.868022 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.870878 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.871982 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:39 crc kubenswrapper[4774]: I0127 00:09:39.883127 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") pod \"redhat-operators-mch4n\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.020850 4774 generic.go:334] "Generic (PLEG): container finished" podID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerID="59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532" exitCode=0 Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.020973 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerDied","Data":"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532"} Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.024599 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.025324 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerID="7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731" exitCode=0 Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.025571 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerDied","Data":"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731"} Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.025601 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerStarted","Data":"ce3bedf60ecff127b9cf54aca7f49498f367bae4df2ca706885feb215a762b62"} Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.192040 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.226333 4774 patch_prober.go:28] interesting pod/router-default-5444994796-44dpr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 00:09:40 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Jan 27 00:09:40 crc kubenswrapper[4774]: [+]process-running ok Jan 27 00:09:40 crc kubenswrapper[4774]: healthz check failed Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.226777 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44dpr" podUID="607f973b-fc78-4c11-bc09-fdbbe414d8c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.330360 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.331508 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.335814 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.337216 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.338867 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.379912 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.389851 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: W0127 00:09:40.421658 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e47ec75_abd8_41d4_a3c8_e722d21a305f.slice/crio-1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c WatchSource:0}: Error finding container 1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c: Status 404 returned error can't find the container with id 1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.475882 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") pod \"d69e5e66-2cd3-4102-9126-0577e4442e7b\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.475921 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") pod \"d69e5e66-2cd3-4102-9126-0577e4442e7b\" (UID: \"d69e5e66-2cd3-4102-9126-0577e4442e7b\") " Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.476229 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.476269 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.477289 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d69e5e66-2cd3-4102-9126-0577e4442e7b" (UID: "d69e5e66-2cd3-4102-9126-0577e4442e7b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.482544 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d69e5e66-2cd3-4102-9126-0577e4442e7b" (UID: "d69e5e66-2cd3-4102-9126-0577e4442e7b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.576958 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.577101 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.577100 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.577542 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d69e5e66-2cd3-4102-9126-0577e4442e7b-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.577555 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d69e5e66-2cd3-4102-9126-0577e4442e7b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.597375 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:40 crc kubenswrapper[4774]: I0127 00:09:40.684723 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.061351 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d69e5e66-2cd3-4102-9126-0577e4442e7b","Type":"ContainerDied","Data":"ef257a72263bb152bc90b02e175f4869660c82a437b852f0f0c82132e335ab9f"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.061392 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef257a72263bb152bc90b02e175f4869660c82a437b852f0f0c82132e335ab9f" Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.061368 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.064874 4774 generic.go:334] "Generic (PLEG): container finished" podID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerID="ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d" exitCode=0 Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.064966 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerDied","Data":"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.065000 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerStarted","Data":"1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.069061 4774 generic.go:334] "Generic (PLEG): container finished" podID="386f196b-c4bc-4fea-924b-c0487a352310" containerID="783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9" exitCode=0 Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.069117 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerDied","Data":"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.069146 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerStarted","Data":"8a4f7dc871614277fda5710d46e4863b99158a3773f029cd73a7cc991db5ffae"} Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.204665 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 00:09:41 crc kubenswrapper[4774]: W0127 00:09:41.217935 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcd05a6f8_6bd6_40f4_8a17_79fccebb7103.slice/crio-5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af WatchSource:0}: Error finding container 5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af: Status 404 returned error can't find the container with id 5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.223591 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:41 crc kubenswrapper[4774]: I0127 00:09:41.227686 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-44dpr" Jan 27 00:09:42 crc kubenswrapper[4774]: I0127 00:09:42.082283 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cd05a6f8-6bd6-40f4-8a17-79fccebb7103","Type":"ContainerStarted","Data":"5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af"} Jan 27 00:09:43 crc kubenswrapper[4774]: I0127 00:09:43.092142 4774 generic.go:334] "Generic (PLEG): container finished" podID="cd05a6f8-6bd6-40f4-8a17-79fccebb7103" containerID="2094c034871a99cde3fa9218cb44ce39e3caf7cba67f764090a2ef81441ea7ec" exitCode=0 Jan 27 00:09:43 crc 
kubenswrapper[4774]: I0127 00:09:43.092256 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cd05a6f8-6bd6-40f4-8a17-79fccebb7103","Type":"ContainerDied","Data":"2094c034871a99cde3fa9218cb44ce39e3caf7cba67f764090a2ef81441ea7ec"} Jan 27 00:09:43 crc kubenswrapper[4774]: I0127 00:09:43.785409 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-th968" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.319297 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.463110 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") pod \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.463208 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") pod \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\" (UID: \"cd05a6f8-6bd6-40f4-8a17-79fccebb7103\") " Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.464380 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cd05a6f8-6bd6-40f4-8a17-79fccebb7103" (UID: "cd05a6f8-6bd6-40f4-8a17-79fccebb7103"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.469049 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cd05a6f8-6bd6-40f4-8a17-79fccebb7103" (UID: "cd05a6f8-6bd6-40f4-8a17-79fccebb7103"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.564840 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:44 crc kubenswrapper[4774]: I0127 00:09:44.565388 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd05a6f8-6bd6-40f4-8a17-79fccebb7103-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:09:45 crc kubenswrapper[4774]: I0127 00:09:45.121897 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cd05a6f8-6bd6-40f4-8a17-79fccebb7103","Type":"ContainerDied","Data":"5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af"} Jan 27 00:09:45 crc kubenswrapper[4774]: I0127 00:09:45.121954 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e0f15f81a552e5a6f966000dc9bf9395c95f317fda446bd5fbe90cdc233f5af" Jan 27 00:09:45 crc kubenswrapper[4774]: I0127 00:09:45.122039 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 00:09:46 crc kubenswrapper[4774]: I0127 00:09:46.395057 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:46 crc kubenswrapper[4774]: I0127 00:09:46.403628 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e639e1da-0d65-4d42-b1fc-23d5db91e9e6-metrics-certs\") pod \"network-metrics-daemon-6djzf\" (UID: \"e639e1da-0d65-4d42-b1fc-23d5db91e9e6\") " pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:46 crc kubenswrapper[4774]: I0127 00:09:46.683439 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6djzf" Jan 27 00:09:48 crc kubenswrapper[4774]: I0127 00:09:48.131081 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nt6rp" Jan 27 00:09:48 crc kubenswrapper[4774]: I0127 00:09:48.171692 4774 patch_prober.go:28] interesting pod/console-f9d7485db-776sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Jan 27 00:09:48 crc kubenswrapper[4774]: I0127 00:09:48.171747 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-776sg" podUID="193a08ea-86ea-4176-898e-6a2c476ea6e9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.28:8443/health\": dial tcp 10.217.0.28:8443: connect: connection refused" Jan 27 00:09:56 crc kubenswrapper[4774]: I0127 00:09:56.002392 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:09:58 crc kubenswrapper[4774]: I0127 00:09:58.176038 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:09:58 crc kubenswrapper[4774]: I0127 00:09:58.179851 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-776sg" Jan 27 00:10:03 crc kubenswrapper[4774]: I0127 00:10:03.260985 4774 generic.go:334] "Generic (PLEG): container finished" podID="518a161e-aeab-4ad6-a2c0-dee7ec963958" containerID="0291aad17e7875c8c8bae7dffadc2698234bc0f4a826218ffa06fe13e248194a" exitCode=0 Jan 27 00:10:03 crc kubenswrapper[4774]: I0127 00:10:03.261051 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29491200-4blsg" event={"ID":"518a161e-aeab-4ad6-a2c0-dee7ec963958","Type":"ContainerDied","Data":"0291aad17e7875c8c8bae7dffadc2698234bc0f4a826218ffa06fe13e248194a"} Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.084108 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage120616833/1\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.084555 4774 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w8rvm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mch4n_openshift-marketplace(6e47ec75-abd8-41d4-a3c8-e722d21a305f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage120616833/1\": happened during read: context canceled" logger="UnhandledError" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.085736 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage120616833/1\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-mch4n" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.116304 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.116466 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57w5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kb8hq_openshift-marketplace(f4f2c4e4-f111-453e-b953-cdf2528f9a3e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.118066 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kb8hq" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.176128 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.176725 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nvt8p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ngr5v_openshift-marketplace(5e621715-39b3-4094-b589-ede8b74b0e8f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.177910 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ngr5v" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.224248 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.224406 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g24zl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-tlk6f_openshift-marketplace(b5f7850c-533c-48e5-bead-ddc0e7ba8d83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:10:05 crc kubenswrapper[4774]: E0127 00:10:05.225832 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tlk6f" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" Jan 27 00:10:06 crc kubenswrapper[4774]: I0127 00:10:06.675893 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:10:06 crc kubenswrapper[4774]: I0127 00:10:06.676429 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.833485 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ngr5v" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.834322 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kb8hq" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 
00:10:08.834397 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mch4n" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.834481 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tlk6f" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" Jan 27 00:10:08 crc kubenswrapper[4774]: I0127 00:10:08.879197 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-49cqc" Jan 27 00:10:08 crc kubenswrapper[4774]: I0127 00:10:08.932171 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.972263 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.972436 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vk44x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-97k6k_openshift-marketplace(386f196b-c4bc-4fea-924b-c0487a352310): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:10:08 crc kubenswrapper[4774]: E0127 00:10:08.973613 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-97k6k" podUID="386f196b-c4bc-4fea-924b-c0487a352310" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.035036 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") pod \"518a161e-aeab-4ad6-a2c0-dee7ec963958\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.035608 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") pod \"518a161e-aeab-4ad6-a2c0-dee7ec963958\" (UID: \"518a161e-aeab-4ad6-a2c0-dee7ec963958\") " Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.036788 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca" (OuterVolumeSpecName: "serviceca") pod "518a161e-aeab-4ad6-a2c0-dee7ec963958" (UID: "518a161e-aeab-4ad6-a2c0-dee7ec963958"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.046013 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr" (OuterVolumeSpecName: "kube-api-access-8kbkr") pod "518a161e-aeab-4ad6-a2c0-dee7ec963958" (UID: "518a161e-aeab-4ad6-a2c0-dee7ec963958"). InnerVolumeSpecName "kube-api-access-8kbkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.137101 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kbkr\" (UniqueName: \"kubernetes.io/projected/518a161e-aeab-4ad6-a2c0-dee7ec963958-kube-api-access-8kbkr\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.137164 4774 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/518a161e-aeab-4ad6-a2c0-dee7ec963958-serviceca\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.296405 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerStarted","Data":"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0"} Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.302505 4774 generic.go:334] "Generic (PLEG): container finished" podID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerID="b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724" exitCode=0 Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.302652 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerDied","Data":"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724"} Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.306820 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29491200-4blsg" 
event={"ID":"518a161e-aeab-4ad6-a2c0-dee7ec963958","Type":"ContainerDied","Data":"b36612e805e28ae8902021d1d0300be7876973ace2dc30015657a43e1a1a97ef"} Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.306868 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b36612e805e28ae8902021d1d0300be7876973ace2dc30015657a43e1a1a97ef" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.306932 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29491200-4blsg" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.318363 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerStarted","Data":"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f"} Jan 27 00:10:09 crc kubenswrapper[4774]: E0127 00:10:09.319056 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-97k6k" podUID="386f196b-c4bc-4fea-924b-c0487a352310" Jan 27 00:10:09 crc kubenswrapper[4774]: I0127 00:10:09.358288 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6djzf"] Jan 27 00:10:09 crc kubenswrapper[4774]: W0127 00:10:09.468250 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode639e1da_0d65_4d42_b1fc_23d5db91e9e6.slice/crio-2dc19f95b24a889fa65ef843dbc6201539cabe87f4a95933fe0c63dbf910b530 WatchSource:0}: Error finding container 2dc19f95b24a889fa65ef843dbc6201539cabe87f4a95933fe0c63dbf910b530: Status 404 returned error can't find the container with id 2dc19f95b24a889fa65ef843dbc6201539cabe87f4a95933fe0c63dbf910b530 Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.326618 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerStarted","Data":"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.329030 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6djzf" event={"ID":"e639e1da-0d65-4d42-b1fc-23d5db91e9e6","Type":"ContainerStarted","Data":"847ee6772016091d4a0142e3d9b733102020545c4889f9bf358fc28dfe1d709a"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.329068 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6djzf" event={"ID":"e639e1da-0d65-4d42-b1fc-23d5db91e9e6","Type":"ContainerStarted","Data":"27e7305723b8b46d31d00cba48862e9225f2fbda4ec93f9fff46d97c268a5ffb"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.329111 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6djzf" event={"ID":"e639e1da-0d65-4d42-b1fc-23d5db91e9e6","Type":"ContainerStarted","Data":"2dc19f95b24a889fa65ef843dbc6201539cabe87f4a95933fe0c63dbf910b530"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.332052 4774 generic.go:334] "Generic (PLEG): container finished" podID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerID="20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f" exitCode=0 Jan 27 00:10:10 
crc kubenswrapper[4774]: I0127 00:10:10.332129 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerDied","Data":"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.338454 4774 generic.go:334] "Generic (PLEG): container finished" podID="af002227-deb6-4a24-8ce5-051f93bc178b" containerID="b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0" exitCode=0 Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.338532 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerDied","Data":"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0"} Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.348909 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ds9fx" podStartSLOduration=2.336960469 podStartE2EDuration="32.348876189s" podCreationTimestamp="2026-01-27 00:09:38 +0000 UTC" firstStartedPulling="2026-01-27 00:09:40.023408462 +0000 UTC m=+158.329185336" lastFinishedPulling="2026-01-27 00:10:10.035324172 +0000 UTC m=+188.341101056" observedRunningTime="2026-01-27 00:10:10.346630689 +0000 UTC m=+188.652407583" watchObservedRunningTime="2026-01-27 00:10:10.348876189 +0000 UTC m=+188.654653083" Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.405938 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6djzf" podStartSLOduration=168.405912977 podStartE2EDuration="2m48.405912977s" podCreationTimestamp="2026-01-27 00:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:10:10.404229642 +0000 UTC m=+188.710006526" watchObservedRunningTime="2026-01-27 00:10:10.405912977 +0000 UTC m=+188.711689861" Jan 27 00:10:10 crc kubenswrapper[4774]: I0127 00:10:10.693184 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 00:10:11 crc kubenswrapper[4774]: I0127 00:10:11.346251 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerStarted","Data":"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e"} Jan 27 00:10:11 crc kubenswrapper[4774]: I0127 00:10:11.349979 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerStarted","Data":"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4"} Jan 27 00:10:11 crc kubenswrapper[4774]: I0127 00:10:11.379135 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnlbr" podStartSLOduration=2.473852607 podStartE2EDuration="35.379107725s" podCreationTimestamp="2026-01-27 00:09:36 +0000 UTC" firstStartedPulling="2026-01-27 00:09:37.978904049 +0000 UTC m=+156.284680933" lastFinishedPulling="2026-01-27 00:10:10.884159167 +0000 UTC m=+189.189936051" observedRunningTime="2026-01-27 00:10:11.374972054 +0000 UTC m=+189.680748948" watchObservedRunningTime="2026-01-27 00:10:11.379107725 +0000 UTC 
m=+189.684884609" Jan 27 00:10:11 crc kubenswrapper[4774]: I0127 00:10:11.434891 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pj24q" podStartSLOduration=2.610337687 podStartE2EDuration="35.434853259s" podCreationTimestamp="2026-01-27 00:09:36 +0000 UTC" firstStartedPulling="2026-01-27 00:09:37.983526191 +0000 UTC m=+156.289303075" lastFinishedPulling="2026-01-27 00:10:10.808041753 +0000 UTC m=+189.113818647" observedRunningTime="2026-01-27 00:10:11.430848241 +0000 UTC m=+189.736625125" watchObservedRunningTime="2026-01-27 00:10:11.434853259 +0000 UTC m=+189.740630143" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.718969 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 00:10:14 crc kubenswrapper[4774]: E0127 00:10:14.721574 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69e5e66-2cd3-4102-9126-0577e4442e7b" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.721752 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69e5e66-2cd3-4102-9126-0577e4442e7b" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: E0127 00:10:14.721901 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd05a6f8-6bd6-40f4-8a17-79fccebb7103" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.721984 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd05a6f8-6bd6-40f4-8a17-79fccebb7103" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: E0127 00:10:14.722111 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518a161e-aeab-4ad6-a2c0-dee7ec963958" containerName="image-pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.722190 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="518a161e-aeab-4ad6-a2c0-dee7ec963958" containerName="image-pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.722400 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="518a161e-aeab-4ad6-a2c0-dee7ec963958" containerName="image-pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.722597 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd05a6f8-6bd6-40f4-8a17-79fccebb7103" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.722721 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69e5e66-2cd3-4102-9126-0577e4442e7b" containerName="pruner" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.724047 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.727488 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.728124 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.728505 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.824942 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.825075 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.926166 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.926268 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.926673 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:14 crc kubenswrapper[4774]: I0127 00:10:14.955439 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:15 crc kubenswrapper[4774]: I0127 00:10:15.061207 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:15 crc kubenswrapper[4774]: I0127 00:10:15.492222 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 00:10:15 crc kubenswrapper[4774]: W0127 00:10:15.501735 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod15eaf928_73fe_4143_8de9_77c76dc4a990.slice/crio-e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a WatchSource:0}: Error finding container e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a: Status 404 returned error can't find the container with id e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.393634 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15eaf928-73fe-4143-8de9-77c76dc4a990","Type":"ContainerStarted","Data":"f0aa1f85fd14124d38f2ae424a2c19c56e444540fb77f1c91838b8daf417fe57"} Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.394283 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15eaf928-73fe-4143-8de9-77c76dc4a990","Type":"ContainerStarted","Data":"e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a"} Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.414339 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.414318082 podStartE2EDuration="2.414318082s" podCreationTimestamp="2026-01-27 00:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:10:16.41015206 +0000 UTC m=+194.715928944" watchObservedRunningTime="2026-01-27 00:10:16.414318082 +0000 UTC m=+194.720094966" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.523724 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.549153 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.549229 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.644699 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.647413 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.882911 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:10:16 crc kubenswrapper[4774]: I0127 00:10:16.883563 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:10:17 crc kubenswrapper[4774]: I0127 00:10:17.400154 4774 generic.go:334] "Generic (PLEG): container finished" podID="15eaf928-73fe-4143-8de9-77c76dc4a990" containerID="f0aa1f85fd14124d38f2ae424a2c19c56e444540fb77f1c91838b8daf417fe57" exitCode=0 
Jan 27 00:10:17 crc kubenswrapper[4774]: I0127 00:10:17.400208 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15eaf928-73fe-4143-8de9-77c76dc4a990","Type":"ContainerDied","Data":"f0aa1f85fd14124d38f2ae424a2c19c56e444540fb77f1c91838b8daf417fe57"} Jan 27 00:10:17 crc kubenswrapper[4774]: I0127 00:10:17.457360 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:10:17 crc kubenswrapper[4774]: I0127 00:10:17.458200 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.614422 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.615302 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.680928 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.760129 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.879563 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") pod \"15eaf928-73fe-4143-8de9-77c76dc4a990\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.879768 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") pod \"15eaf928-73fe-4143-8de9-77c76dc4a990\" (UID: \"15eaf928-73fe-4143-8de9-77c76dc4a990\") " Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.879848 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "15eaf928-73fe-4143-8de9-77c76dc4a990" (UID: "15eaf928-73fe-4143-8de9-77c76dc4a990"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.880081 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15eaf928-73fe-4143-8de9-77c76dc4a990-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.888497 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "15eaf928-73fe-4143-8de9-77c76dc4a990" (UID: "15eaf928-73fe-4143-8de9-77c76dc4a990"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:18 crc kubenswrapper[4774]: I0127 00:10:18.982189 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15eaf928-73fe-4143-8de9-77c76dc4a990-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.412592 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"15eaf928-73fe-4143-8de9-77c76dc4a990","Type":"ContainerDied","Data":"e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a"} Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.413110 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8fcdc2d03704b966cd0dbe8bdf5640b69380bf6aa553dc2077fdc28c7bf4d1a" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.412878 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.476882 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.525781 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 00:10:19 crc kubenswrapper[4774]: E0127 00:10:19.526133 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15eaf928-73fe-4143-8de9-77c76dc4a990" containerName="pruner" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.526166 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="15eaf928-73fe-4143-8de9-77c76dc4a990" containerName="pruner" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.526280 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="15eaf928-73fe-4143-8de9-77c76dc4a990" containerName="pruner" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.526777 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.529467 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.529722 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.541977 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.592808 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.593296 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.593392 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.694764 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.695718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.695844 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.695810 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.695645 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.715672 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") pod \"installer-9-crc\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:19 crc kubenswrapper[4774]: I0127 00:10:19.848967 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:10:20 crc kubenswrapper[4774]: I0127 00:10:20.274566 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 00:10:20 crc kubenswrapper[4774]: W0127 00:10:20.282559 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd05ddc41_daa1_4deb_a5bf_d5b5f9181a1c.slice/crio-ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa WatchSource:0}: Error finding container ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa: Status 404 returned error can't find the container with id ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa Jan 27 00:10:20 crc kubenswrapper[4774]: I0127 00:10:20.418058 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c","Type":"ContainerStarted","Data":"ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa"} Jan 27 00:10:21 crc kubenswrapper[4774]: I0127 00:10:21.433015 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c","Type":"ContainerStarted","Data":"05e7a9c90f1cfc8896ce47e0dacfdae023353b22d601861957e2c1280d6e97a3"} Jan 27 00:10:21 crc kubenswrapper[4774]: I0127 00:10:21.435084 4774 generic.go:334] "Generic (PLEG): container finished" podID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerID="99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917" exitCode=0 Jan 27 00:10:21 crc kubenswrapper[4774]: I0127 00:10:21.435658 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerDied","Data":"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917"} Jan 27 00:10:21 crc kubenswrapper[4774]: I0127 00:10:21.455731 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.455709481 podStartE2EDuration="2.455709481s" podCreationTimestamp="2026-01-27 00:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:10:21.45255094 +0000 UTC m=+199.758327824" watchObservedRunningTime="2026-01-27 00:10:21.455709481 +0000 UTC m=+199.761486365" Jan 27 00:10:23 crc kubenswrapper[4774]: I0127 00:10:23.447194 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerStarted","Data":"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29"} Jan 27 00:10:23 crc kubenswrapper[4774]: I0127 00:10:23.449676 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerStarted","Data":"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92"} Jan 27 00:10:23 crc kubenswrapper[4774]: I0127 00:10:23.451457 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerStarted","Data":"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c"} Jan 27 00:10:23 crc kubenswrapper[4774]: I0127 00:10:23.453761 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerStarted","Data":"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.461667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerStarted","Data":"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.464342 4774 generic.go:334] "Generic (PLEG): container finished" podID="386f196b-c4bc-4fea-924b-c0487a352310" containerID="d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c" exitCode=0 Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.464417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerDied","Data":"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.467002 4774 generic.go:334] "Generic (PLEG): container finished" podID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerID="5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4" exitCode=0 Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.467082 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerDied","Data":"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.474310 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerID="72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29" exitCode=0 Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.474364 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerDied","Data":"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29"} Jan 27 00:10:24 crc kubenswrapper[4774]: I0127 00:10:24.484476 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ngr5v" podStartSLOduration=3.5101536380000002 podStartE2EDuration="48.484454948s" podCreationTimestamp="2026-01-27 00:09:36 +0000 UTC" firstStartedPulling="2026-01-27 00:09:37.98190041 +0000 UTC m=+156.287677294" lastFinishedPulling="2026-01-27 00:10:22.95620172 +0000 UTC m=+201.261978604" observedRunningTime="2026-01-27 00:10:23.544467321 +0000 UTC m=+201.850244205" watchObservedRunningTime="2026-01-27 00:10:24.484454948 +0000 UTC m=+202.790231832" Jan 27 00:10:25 crc 
kubenswrapper[4774]: I0127 00:10:25.486286 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerStarted","Data":"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b"} Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.490022 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerStarted","Data":"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49"} Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.492476 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerStarted","Data":"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5"} Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.494292 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerID="29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb" exitCode=0 Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.494346 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerDied","Data":"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb"} Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.509122 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-97k6k" podStartSLOduration=2.720155274 podStartE2EDuration="46.509109024s" podCreationTimestamp="2026-01-27 00:09:39 +0000 UTC" firstStartedPulling="2026-01-27 00:09:41.072964503 +0000 UTC m=+159.378741387" lastFinishedPulling="2026-01-27 00:10:24.861918253 +0000 UTC m=+203.167695137" observedRunningTime="2026-01-27 00:10:25.507945984 +0000 UTC m=+203.813722868" watchObservedRunningTime="2026-01-27 00:10:25.509109024 +0000 UTC m=+203.814885908" Jan 27 00:10:25 crc kubenswrapper[4774]: I0127 00:10:25.527399 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mch4n" podStartSLOduration=2.705947161 podStartE2EDuration="46.527374661s" podCreationTimestamp="2026-01-27 00:09:39 +0000 UTC" firstStartedPulling="2026-01-27 00:09:41.067042783 +0000 UTC m=+159.372819667" lastFinishedPulling="2026-01-27 00:10:24.888470283 +0000 UTC m=+203.194247167" observedRunningTime="2026-01-27 00:10:25.525925194 +0000 UTC m=+203.831702078" watchObservedRunningTime="2026-01-27 00:10:25.527374661 +0000 UTC m=+203.833151545" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.504559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerStarted","Data":"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386"} Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.523750 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tlk6f" podStartSLOduration=2.341231838 podStartE2EDuration="48.523714842s" podCreationTimestamp="2026-01-27 00:09:38 +0000 UTC" firstStartedPulling="2026-01-27 00:09:40.027033463 +0000 UTC m=+158.332810347" lastFinishedPulling="2026-01-27 00:10:26.209516467 +0000 UTC 
m=+204.515293351" observedRunningTime="2026-01-27 00:10:26.523093085 +0000 UTC m=+204.828869969" watchObservedRunningTime="2026-01-27 00:10:26.523714842 +0000 UTC m=+204.829491726" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.529156 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kb8hq" podStartSLOduration=3.553429184 podStartE2EDuration="50.52914077s" podCreationTimestamp="2026-01-27 00:09:36 +0000 UTC" firstStartedPulling="2026-01-27 00:09:37.987272395 +0000 UTC m=+156.293049279" lastFinishedPulling="2026-01-27 00:10:24.962983981 +0000 UTC m=+203.268760865" observedRunningTime="2026-01-27 00:10:25.571151642 +0000 UTC m=+203.876928526" watchObservedRunningTime="2026-01-27 00:10:26.52914077 +0000 UTC m=+204.834917654" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.829343 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.829781 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:26 crc kubenswrapper[4774]: I0127 00:10:26.878489 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:27 crc kubenswrapper[4774]: I0127 00:10:27.060389 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:27 crc kubenswrapper[4774]: I0127 00:10:27.060462 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:27 crc kubenswrapper[4774]: I0127 00:10:27.122402 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:28 crc kubenswrapper[4774]: I0127 00:10:28.582946 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.038978 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.039030 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.076284 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.310661 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.652659 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:10:29 crc kubenswrapper[4774]: I0127 00:10:29.652717 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.025851 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.025933 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.525298 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ngr5v" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="registry-server" containerID="cri-o://7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" gracePeriod=2 Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.696026 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-97k6k" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" probeResult="failure" output=< Jan 27 00:10:30 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:10:30 crc kubenswrapper[4774]: > Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.864619 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.956140 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") pod \"5e621715-39b3-4094-b589-ede8b74b0e8f\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.956307 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") pod \"5e621715-39b3-4094-b589-ede8b74b0e8f\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.956403 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") pod \"5e621715-39b3-4094-b589-ede8b74b0e8f\" (UID: \"5e621715-39b3-4094-b589-ede8b74b0e8f\") " Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.957090 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities" (OuterVolumeSpecName: "utilities") pod "5e621715-39b3-4094-b589-ede8b74b0e8f" (UID: "5e621715-39b3-4094-b589-ede8b74b0e8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.959960 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:30 crc kubenswrapper[4774]: I0127 00:10:30.977092 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p" (OuterVolumeSpecName: "kube-api-access-nvt8p") pod "5e621715-39b3-4094-b589-ede8b74b0e8f" (UID: "5e621715-39b3-4094-b589-ede8b74b0e8f"). InnerVolumeSpecName "kube-api-access-nvt8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.011253 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e621715-39b3-4094-b589-ede8b74b0e8f" (UID: "5e621715-39b3-4094-b589-ede8b74b0e8f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.061890 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvt8p\" (UniqueName: \"kubernetes.io/projected/5e621715-39b3-4094-b589-ede8b74b0e8f-kube-api-access-nvt8p\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.061955 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e621715-39b3-4094-b589-ede8b74b0e8f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.081550 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mch4n" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" probeResult="failure" output=< Jan 27 00:10:31 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:10:31 crc kubenswrapper[4774]: > Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532002 4774 generic.go:334] "Generic (PLEG): container finished" podID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerID="7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" exitCode=0 Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532059 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerDied","Data":"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92"} Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532073 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngr5v" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532098 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngr5v" event={"ID":"5e621715-39b3-4094-b589-ede8b74b0e8f","Type":"ContainerDied","Data":"fafa96ca772c719431784a78d0a35959db893c145e408b85c8cbdfa6a0078917"} Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.532118 4774 scope.go:117] "RemoveContainer" containerID="7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.550018 4774 scope.go:117] "RemoveContainer" containerID="99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.556727 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.559525 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ngr5v"] Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.580810 4774 scope.go:117] "RemoveContainer" containerID="5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.594048 4774 scope.go:117] "RemoveContainer" containerID="7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" Jan 27 00:10:31 crc kubenswrapper[4774]: E0127 00:10:31.595438 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92\": container with ID starting with 7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92 not found: ID does not exist" containerID="7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.595487 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92"} err="failed to get container status \"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92\": rpc error: code = NotFound desc = could not find container \"7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92\": container with ID starting with 7ae9cdab6889a8cc9c076b2f25450075428588d319410386e300ebe550071c92 not found: ID does not exist" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.595540 4774 scope.go:117] "RemoveContainer" containerID="99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917" Jan 27 00:10:31 crc kubenswrapper[4774]: E0127 00:10:31.596150 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917\": container with ID starting with 99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917 not found: ID does not exist" containerID="99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.596193 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917"} err="failed to get container status \"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917\": rpc error: code = NotFound desc = could not find 
container \"99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917\": container with ID starting with 99bf4a3c7e39920cef0a35d103fed1db6ce38dde31687975d6a14f8007d66917 not found: ID does not exist" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.596225 4774 scope.go:117] "RemoveContainer" containerID="5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202" Jan 27 00:10:31 crc kubenswrapper[4774]: E0127 00:10:31.596459 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202\": container with ID starting with 5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202 not found: ID does not exist" containerID="5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202" Jan 27 00:10:31 crc kubenswrapper[4774]: I0127 00:10:31.596481 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202"} err="failed to get container status \"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202\": rpc error: code = NotFound desc = could not find container \"5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202\": container with ID starting with 5fff880fcbd21c3e437c76288c0cf323028218e867d2f0fac59899494fb16202 not found: ID does not exist" Jan 27 00:10:32 crc kubenswrapper[4774]: I0127 00:10:32.362402 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" path="/var/lib/kubelet/pods/5e621715-39b3-4094-b589-ede8b74b0e8f/volumes" Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.675723 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.676091 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.676148 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.677200 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:10:36 crc kubenswrapper[4774]: I0127 00:10:36.677347 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85" gracePeriod=600 Jan 27 00:10:37 crc kubenswrapper[4774]: I0127 00:10:37.116390 4774 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:37 crc kubenswrapper[4774]: I0127 00:10:37.586232 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85" exitCode=0 Jan 27 00:10:37 crc kubenswrapper[4774]: I0127 00:10:37.586593 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85"} Jan 27 00:10:37 crc kubenswrapper[4774]: I0127 00:10:37.586631 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5"} Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.087820 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.710072 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.710587 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kb8hq" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="registry-server" containerID="cri-o://778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" gracePeriod=2 Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.721530 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:10:39 crc kubenswrapper[4774]: I0127 00:10:39.769676 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.059666 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.067458 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.085799 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") pod \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.085869 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") pod \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.085952 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") pod \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\" (UID: \"f4f2c4e4-f111-453e-b953-cdf2528f9a3e\") " Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.089456 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities" (OuterVolumeSpecName: "utilities") pod "f4f2c4e4-f111-453e-b953-cdf2528f9a3e" (UID: "f4f2c4e4-f111-453e-b953-cdf2528f9a3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.096376 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g" (OuterVolumeSpecName: "kube-api-access-57w5g") pod "f4f2c4e4-f111-453e-b953-cdf2528f9a3e" (UID: "f4f2c4e4-f111-453e-b953-cdf2528f9a3e"). InnerVolumeSpecName "kube-api-access-57w5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.110225 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.138785 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4f2c4e4-f111-453e-b953-cdf2528f9a3e" (UID: "f4f2c4e4-f111-453e-b953-cdf2528f9a3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.187372 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.187402 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57w5g\" (UniqueName: \"kubernetes.io/projected/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-kube-api-access-57w5g\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.187413 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f2c4e4-f111-453e-b953-cdf2528f9a3e-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601303 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerID="778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" exitCode=0 Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601368 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb8hq" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601381 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerDied","Data":"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5"} Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601430 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb8hq" event={"ID":"f4f2c4e4-f111-453e-b953-cdf2528f9a3e","Type":"ContainerDied","Data":"fee2b67cf24ce6f088a6fa76fdec56fa754242f0563e3da4ab6a39b52a665f76"} Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.601461 4774 scope.go:117] "RemoveContainer" containerID="778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.626123 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.633564 4774 scope.go:117] "RemoveContainer" containerID="72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.636389 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kb8hq"] Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.650919 4774 scope.go:117] "RemoveContainer" containerID="32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.667496 4774 scope.go:117] "RemoveContainer" containerID="778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" Jan 27 00:10:40 crc kubenswrapper[4774]: E0127 00:10:40.668060 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5\": container with ID starting with 778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5 not found: ID does not exist" containerID="778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.668134 
4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5"} err="failed to get container status \"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5\": rpc error: code = NotFound desc = could not find container \"778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5\": container with ID starting with 778f62d5f90758ddae5ea9193041445a79c82c0add4f2d4680131b366a664ab5 not found: ID does not exist" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.668182 4774 scope.go:117] "RemoveContainer" containerID="72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29" Jan 27 00:10:40 crc kubenswrapper[4774]: E0127 00:10:40.668777 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29\": container with ID starting with 72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29 not found: ID does not exist" containerID="72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.668806 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29"} err="failed to get container status \"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29\": rpc error: code = NotFound desc = could not find container \"72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29\": container with ID starting with 72bd755aa27a5d81b3f02b0a05eb2dd0ed5b9f409542a25787d13074331d2d29 not found: ID does not exist" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.668822 4774 scope.go:117] "RemoveContainer" containerID="32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf" Jan 27 00:10:40 crc kubenswrapper[4774]: E0127 00:10:40.669359 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf\": container with ID starting with 32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf not found: ID does not exist" containerID="32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf" Jan 27 00:10:40 crc kubenswrapper[4774]: I0127 00:10:40.669398 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf"} err="failed to get container status \"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf\": rpc error: code = NotFound desc = could not find container \"32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf\": container with ID starting with 32fab443e8c3584e78213a37102128904662b15ab75ac2e987868085e99936bf not found: ID does not exist" Jan 27 00:10:41 crc kubenswrapper[4774]: I0127 00:10:41.559776 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" containerID="cri-o://d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" gracePeriod=15 Jan 27 00:10:41 crc kubenswrapper[4774]: I0127 00:10:41.914038 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:10:41 crc kubenswrapper[4774]: I0127 00:10:41.914789 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tlk6f" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="registry-server" containerID="cri-o://0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" gracePeriod=2 Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.097235 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115260 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115371 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115409 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115452 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115489 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115535 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115608 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115651 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115714 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115760 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115799 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115847 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.115917 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.116029 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") pod \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\" (UID: \"f8bb8619-eac4-481e-bdb8-5fb5985c1844\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.116568 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.116991 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.117121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.117570 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.117752 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.129445 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp" (OuterVolumeSpecName: "kube-api-access-mh9kp") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "kube-api-access-mh9kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.130259 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.131525 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.132587 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.149139 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.151023 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.153058 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.154749 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.160200 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f8bb8619-eac4-481e-bdb8-5fb5985c1844" (UID: "f8bb8619-eac4-481e-bdb8-5fb5985c1844"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218386 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218441 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh9kp\" (UniqueName: \"kubernetes.io/projected/f8bb8619-eac4-481e-bdb8-5fb5985c1844-kube-api-access-mh9kp\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218462 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218481 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218494 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218508 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218522 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218538 4774 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218553 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218566 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218581 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218593 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/f8bb8619-eac4-481e-bdb8-5fb5985c1844-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218606 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.218622 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f8bb8619-eac4-481e-bdb8-5fb5985c1844-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.315384 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.368829 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" path="/var/lib/kubelet/pods/f4f2c4e4-f111-453e-b953-cdf2528f9a3e/volumes" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.419926 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") pod \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.419998 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") pod \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.420053 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") pod \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\" (UID: \"b5f7850c-533c-48e5-bead-ddc0e7ba8d83\") " Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.421324 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities" (OuterVolumeSpecName: "utilities") pod "b5f7850c-533c-48e5-bead-ddc0e7ba8d83" (UID: "b5f7850c-533c-48e5-bead-ddc0e7ba8d83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.424249 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl" (OuterVolumeSpecName: "kube-api-access-g24zl") pod "b5f7850c-533c-48e5-bead-ddc0e7ba8d83" (UID: "b5f7850c-533c-48e5-bead-ddc0e7ba8d83"). InnerVolumeSpecName "kube-api-access-g24zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.448553 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5f7850c-533c-48e5-bead-ddc0e7ba8d83" (UID: "b5f7850c-533c-48e5-bead-ddc0e7ba8d83"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.521445 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.521492 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g24zl\" (UniqueName: \"kubernetes.io/projected/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-kube-api-access-g24zl\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.521507 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5f7850c-533c-48e5-bead-ddc0e7ba8d83-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.615774 4774 generic.go:334] "Generic (PLEG): container finished" podID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerID="d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" exitCode=0 Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.615883 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" event={"ID":"f8bb8619-eac4-481e-bdb8-5fb5985c1844","Type":"ContainerDied","Data":"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973"} Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.615931 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.615986 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-brc4j" event={"ID":"f8bb8619-eac4-481e-bdb8-5fb5985c1844","Type":"ContainerDied","Data":"6c6a9cbe32487a19776f84a95c618edb7f8a9a884ffdc154828b802d8fe8a819"} Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.616025 4774 scope.go:117] "RemoveContainer" containerID="d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.625958 4774 generic.go:334] "Generic (PLEG): container finished" podID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerID="0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" exitCode=0 Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.626032 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerDied","Data":"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386"} Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.626073 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tlk6f" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.626081 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tlk6f" event={"ID":"b5f7850c-533c-48e5-bead-ddc0e7ba8d83","Type":"ContainerDied","Data":"ce3bedf60ecff127b9cf54aca7f49498f367bae4df2ca706885feb215a762b62"} Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.645547 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.650752 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-brc4j"] Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.663880 4774 scope.go:117] "RemoveContainer" containerID="d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" Jan 27 00:10:42 crc kubenswrapper[4774]: E0127 00:10:42.664682 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973\": container with ID starting with d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973 not found: ID does not exist" containerID="d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.664840 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973"} err="failed to get container status \"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973\": rpc error: code = NotFound desc = could not find container \"d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973\": container with ID starting with d0c2c08b9d6989a217b32e0d7e3342416038b36b787fb089ae3e076e19525973 not found: ID does not exist" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.665022 4774 scope.go:117] "RemoveContainer" containerID="0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.668193 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.670437 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tlk6f"] Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.683602 4774 scope.go:117] "RemoveContainer" containerID="29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.698149 4774 scope.go:117] "RemoveContainer" containerID="7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.712590 4774 scope.go:117] "RemoveContainer" containerID="0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" Jan 27 00:10:42 crc kubenswrapper[4774]: E0127 00:10:42.713028 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386\": container with ID starting with 0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386 not found: ID does not exist" containerID="0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386" Jan 27 00:10:42 crc 
kubenswrapper[4774]: I0127 00:10:42.713059 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386"} err="failed to get container status \"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386\": rpc error: code = NotFound desc = could not find container \"0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386\": container with ID starting with 0e5ec6402e86be1193b0b8a8eae57f5414003d4ff3db197b13e084ce9f0a5386 not found: ID does not exist" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.713086 4774 scope.go:117] "RemoveContainer" containerID="29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb" Jan 27 00:10:42 crc kubenswrapper[4774]: E0127 00:10:42.713387 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb\": container with ID starting with 29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb not found: ID does not exist" containerID="29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.713411 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb"} err="failed to get container status \"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb\": rpc error: code = NotFound desc = could not find container \"29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb\": container with ID starting with 29a4fdddb0d8c9c71a805f46f8ce2b85b53119c6e54e831778612b276e3c33bb not found: ID does not exist" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.713426 4774 scope.go:117] "RemoveContainer" containerID="7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731" Jan 27 00:10:42 crc kubenswrapper[4774]: E0127 00:10:42.713894 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731\": container with ID starting with 7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731 not found: ID does not exist" containerID="7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731" Jan 27 00:10:42 crc kubenswrapper[4774]: I0127 00:10:42.713986 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731"} err="failed to get container status \"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731\": rpc error: code = NotFound desc = could not find container \"7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731\": container with ID starting with 7489ba5fbf97c7368c1a2053d17e98eca4a9b24f8c768b7172963311d3c82731 not found: ID does not exist" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.111066 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.111630 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mch4n" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" 
containerID="cri-o://c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" gracePeriod=2 Jan 27 00:10:44 crc kubenswrapper[4774]: E0127 00:10:44.195852 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e47ec75_abd8_41d4_a3c8_e722d21a305f.slice/crio-c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49.scope\": RecentStats: unable to find data in memory cache]" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.366641 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" path="/var/lib/kubelet/pods/b5f7850c-533c-48e5-bead-ddc0e7ba8d83/volumes" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.368359 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" path="/var/lib/kubelet/pods/f8bb8619-eac4-481e-bdb8-5fb5985c1844/volumes" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.575488 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644015 4774 generic.go:334] "Generic (PLEG): container finished" podID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerID="c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" exitCode=0 Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644064 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerDied","Data":"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49"} Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644101 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mch4n" event={"ID":"6e47ec75-abd8-41d4-a3c8-e722d21a305f","Type":"ContainerDied","Data":"1bc82424f292aa8d315ed22486552e2ce20a0278e67aebe2d803ffbc4e72f07c"} Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644126 4774 scope.go:117] "RemoveContainer" containerID="c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.644264 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mch4n" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.649749 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") pod \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.650494 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") pod \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.650660 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") pod \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\" (UID: \"6e47ec75-abd8-41d4-a3c8-e722d21a305f\") " Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.653235 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities" (OuterVolumeSpecName: "utilities") pod "6e47ec75-abd8-41d4-a3c8-e722d21a305f" (UID: "6e47ec75-abd8-41d4-a3c8-e722d21a305f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.660784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm" (OuterVolumeSpecName: "kube-api-access-w8rvm") pod "6e47ec75-abd8-41d4-a3c8-e722d21a305f" (UID: "6e47ec75-abd8-41d4-a3c8-e722d21a305f"). InnerVolumeSpecName "kube-api-access-w8rvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.667664 4774 scope.go:117] "RemoveContainer" containerID="5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.688688 4774 scope.go:117] "RemoveContainer" containerID="ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.704935 4774 scope.go:117] "RemoveContainer" containerID="c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" Jan 27 00:10:44 crc kubenswrapper[4774]: E0127 00:10:44.705351 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49\": container with ID starting with c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49 not found: ID does not exist" containerID="c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.705396 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49"} err="failed to get container status \"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49\": rpc error: code = NotFound desc = could not find container \"c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49\": container with ID starting with c114cfbf7a2ba0aa73af2c22e030486650def4b5a72a66430527127cd804cd49 not found: ID does not exist" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.705430 4774 scope.go:117] "RemoveContainer" containerID="5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4" Jan 27 00:10:44 crc kubenswrapper[4774]: E0127 00:10:44.705850 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4\": container with ID starting with 5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4 not found: ID does not exist" containerID="5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.705915 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4"} err="failed to get container status \"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4\": rpc error: code = NotFound desc = could not find container \"5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4\": container with ID starting with 5659c2a751dd985515a6d978375fdc4bdde8e8d281fb76b3e6c211407689e8d4 not found: ID does not exist" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.705935 4774 scope.go:117] "RemoveContainer" containerID="ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d" Jan 27 00:10:44 crc kubenswrapper[4774]: E0127 00:10:44.706264 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d\": container with ID starting with ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d not found: ID does not exist" containerID="ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d" Jan 27 00:10:44 crc 
kubenswrapper[4774]: I0127 00:10:44.706296 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d"} err="failed to get container status \"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d\": rpc error: code = NotFound desc = could not find container \"ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d\": container with ID starting with ddaaddf044bbac40b3b3b040c54a3dbe8fedcdbee71125bf95972b6f774adb4d not found: ID does not exist" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.753522 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.753575 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8rvm\" (UniqueName: \"kubernetes.io/projected/6e47ec75-abd8-41d4-a3c8-e722d21a305f-kube-api-access-w8rvm\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.796504 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e47ec75-abd8-41d4-a3c8-e722d21a305f" (UID: "6e47ec75-abd8-41d4-a3c8-e722d21a305f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.854571 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e47ec75-abd8-41d4-a3c8-e722d21a305f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.991355 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:10:44 crc kubenswrapper[4774]: I0127 00:10:44.996638 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mch4n"] Jan 27 00:10:46 crc kubenswrapper[4774]: I0127 00:10:46.365278 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" path="/var/lib/kubelet/pods/6e47ec75-abd8-41d4-a3c8-e722d21a305f/volumes" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633433 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8445cf6b-22hmn"] Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633671 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633686 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633700 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633709 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633722 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633731 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633746 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633754 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633767 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633775 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633792 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633802 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633820 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633831 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633843 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633882 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="extract-utilities" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633897 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633911 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633925 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633935 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633954 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633964 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.633981 4774 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.633993 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="extract-content" Jan 27 00:10:48 crc kubenswrapper[4774]: E0127 00:10:48.634004 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634014 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634164 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f2c4e4-f111-453e-b953-cdf2528f9a3e" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634187 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e47ec75-abd8-41d4-a3c8-e722d21a305f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634211 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bb8619-eac4-481e-bdb8-5fb5985c1844" containerName="oauth-openshift" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634224 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e621715-39b3-4094-b589-ede8b74b0e8f" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634241 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f7850c-533c-48e5-bead-ddc0e7ba8d83" containerName="registry-server" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.634754 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.639360 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.640290 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.640565 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.640853 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.641140 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.641495 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.642156 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.642286 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.642498 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.642767 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.644965 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.648233 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.656368 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8445cf6b-22hmn"] Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.657841 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.661036 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.662688 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712198 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-policies\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 
00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712257 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rmn\" (UniqueName: \"kubernetes.io/projected/ba9690af-8b7d-4404-93d7-87b2de9bd203-kube-api-access-b5rmn\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712306 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712332 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-login\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712364 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712435 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712480 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-dir\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712547 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: 
\"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712571 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-router-certs\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712598 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712623 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-error\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712649 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-session\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.712730 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-service-ca\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814476 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-policies\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814533 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5rmn\" (UniqueName: \"kubernetes.io/projected/ba9690af-8b7d-4404-93d7-87b2de9bd203-kube-api-access-b5rmn\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814569 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-login\") pod 
\"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814586 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814609 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814635 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-dir\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814694 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814712 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-router-certs\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814750 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814769 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-error\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814785 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-session\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.814802 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-service-ca\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.815750 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-service-ca\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.816096 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-policies\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.817150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.817833 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba9690af-8b7d-4404-93d7-87b2de9bd203-audit-dir\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.819147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 
00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.822780 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-router-certs\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.822951 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-login\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.823074 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.824576 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.824618 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.824710 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-system-session\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.826437 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.827617 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba9690af-8b7d-4404-93d7-87b2de9bd203-v4-0-config-user-template-error\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.839117 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rmn\" (UniqueName: \"kubernetes.io/projected/ba9690af-8b7d-4404-93d7-87b2de9bd203-kube-api-access-b5rmn\") pod \"oauth-openshift-8445cf6b-22hmn\" (UID: \"ba9690af-8b7d-4404-93d7-87b2de9bd203\") " pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:48 crc kubenswrapper[4774]: I0127 00:10:48.983919 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.265833 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8445cf6b-22hmn"] Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.691294 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" event={"ID":"ba9690af-8b7d-4404-93d7-87b2de9bd203","Type":"ContainerStarted","Data":"d5c16dbbc94e1e3e428a80a4db9bfb7a6af9932fdaed30a170c2fa3ef5cb0ad2"} Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.691362 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" event={"ID":"ba9690af-8b7d-4404-93d7-87b2de9bd203","Type":"ContainerStarted","Data":"1cf87edd0cc0b54623f996856a7b9303f51103b7b1bc70d8d71d7c00339b3fa0"} Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.691530 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.693257 4774 patch_prober.go:28] interesting pod/oauth-openshift-8445cf6b-22hmn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" start-of-body= Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.693303 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" podUID="ba9690af-8b7d-4404-93d7-87b2de9bd203" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" Jan 27 00:10:49 crc kubenswrapper[4774]: I0127 00:10:49.712890 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" podStartSLOduration=33.71285468 podStartE2EDuration="33.71285468s" podCreationTimestamp="2026-01-27 00:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:10:49.709099023 +0000 UTC m=+228.014875907" watchObservedRunningTime="2026-01-27 00:10:49.71285468 +0000 UTC m=+228.018631564" Jan 27 00:10:50 crc kubenswrapper[4774]: I0127 00:10:50.700061 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8445cf6b-22hmn" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.572736 4774 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.574175 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575233 4774 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575572 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575602 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575752 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575831 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.575896 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" gracePeriod=15 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578278 4774 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578539 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578552 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578565 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578573 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578582 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578588 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 00:10:58 crc 
kubenswrapper[4774]: E0127 00:10:58.578597 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578603 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578610 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578615 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578624 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578630 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578639 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578646 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578742 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578756 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578768 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578780 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578790 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578799 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.578911 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.578921 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.579030 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.640671 4774 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.27:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.659123 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.659294 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.659349 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.659442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.660026 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.767899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.767941 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.767974 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768008 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768036 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768072 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768089 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768203 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768224 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768251 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.768280 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.770517 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.772766 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773551 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" exitCode=0 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773568 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" exitCode=0 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773576 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" exitCode=0 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773583 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" exitCode=2 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.773643 4774 scope.go:117] "RemoveContainer" containerID="4a301ad1f523e6487020700b3d05571cece7ba68f8befd99c7c7b6ff7a8ec80c" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.785912 4774 generic.go:334] "Generic (PLEG): container finished" podID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" containerID="05e7a9c90f1cfc8896ce47e0dacfdae023353b22d601861957e2c1280d6e97a3" exitCode=0 Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.785983 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c","Type":"ContainerDied","Data":"05e7a9c90f1cfc8896ce47e0dacfdae023353b22d601861957e2c1280d6e97a3"} Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.787534 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.788368 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" 
Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869648 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869704 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869782 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869792 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.869803 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: I0127 00:10:58.943082 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:58 crc kubenswrapper[4774]: E0127 00:10:58.981134 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e6e021e22ae84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,LastTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:10:59 crc kubenswrapper[4774]: E0127 00:10:59.443384 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e6e021e22ae84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,LastTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:10:59 crc kubenswrapper[4774]: I0127 00:10:59.794232 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2"} Jan 27 00:10:59 crc kubenswrapper[4774]: I0127 00:10:59.794322 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"09beeb18b226da275eae17f55abb5dddd6b7c6f8844e81dd809b2041a4715d87"} Jan 27 00:10:59 crc kubenswrapper[4774]: E0127 00:10:59.795394 4774 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.27:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:10:59 crc kubenswrapper[4774]: 
I0127 00:10:59.796697 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:10:59 crc kubenswrapper[4774]: I0127 00:10:59.797521 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:10:59 crc kubenswrapper[4774]: I0127 00:10:59.800673 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.192038 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.193095 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.194000 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293046 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") pod \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293116 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") pod \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293159 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") pod \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\" (UID: \"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c\") " Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293163 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" (UID: "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293417 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.293403 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock" (OuterVolumeSpecName: "var-lock") pod "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" (UID: "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.300704 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" (UID: "d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.394292 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.394621 4774 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.829157 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c","Type":"ContainerDied","Data":"ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa"} Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.829226 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed588f4d65707d0fe13e5dfdb082cdfddb42e10f2ae3fc8508a8ae1455271eaa" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.829313 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.834580 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.953661 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.954611 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.955157 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:00 crc kubenswrapper[4774]: I0127 00:11:00.955521 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104534 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104652 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104672 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104748 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104766 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.104901 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.105175 4774 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.105205 4774 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.105217 4774 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.843686 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.845100 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" exitCode=0 Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.845176 4774 scope.go:117] "RemoveContainer" containerID="03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.845456 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.865100 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.865548 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.875528 4774 scope.go:117] "RemoveContainer" containerID="c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.891114 4774 scope.go:117] "RemoveContainer" containerID="00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.913393 4774 scope.go:117] "RemoveContainer" containerID="0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.936262 4774 scope.go:117] "RemoveContainer" containerID="2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.967759 4774 scope.go:117] "RemoveContainer" containerID="b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.991489 4774 scope.go:117] "RemoveContainer" containerID="03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" Jan 27 00:11:01 crc 
kubenswrapper[4774]: E0127 00:11:01.992357 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\": container with ID starting with 03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a not found: ID does not exist" containerID="03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.992408 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a"} err="failed to get container status \"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\": rpc error: code = NotFound desc = could not find container \"03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a\": container with ID starting with 03a8760866ae91a5b6835ff0c72892c476ed1f8897fade476444be7c33c8385a not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.992444 4774 scope.go:117] "RemoveContainer" containerID="c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.993013 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\": container with ID starting with c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba not found: ID does not exist" containerID="c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.993039 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba"} err="failed to get container status \"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\": rpc error: code = NotFound desc = could not find container \"c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba\": container with ID starting with c13d0dd8c5a39c26bf7dfd4b2754ecd075769b92f3d6c50177c677c2a0c9b9ba not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.993055 4774 scope.go:117] "RemoveContainer" containerID="00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.995132 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\": container with ID starting with 00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d not found: ID does not exist" containerID="00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.995181 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d"} err="failed to get container status \"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\": rpc error: code = NotFound desc = could not find container \"00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d\": container with ID starting with 00b306c65454512226db413a6d8e77283d8b94557c8c19e948bb390c9ca5fa3d not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: 
I0127 00:11:01.995236 4774 scope.go:117] "RemoveContainer" containerID="0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.996003 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\": container with ID starting with 0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203 not found: ID does not exist" containerID="0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.996030 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203"} err="failed to get container status \"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\": rpc error: code = NotFound desc = could not find container \"0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203\": container with ID starting with 0698376585e48338a635465c1495d0baac7dbfc901921eb9a4cdd8c53de68203 not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.996048 4774 scope.go:117] "RemoveContainer" containerID="2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.996389 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\": container with ID starting with 2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98 not found: ID does not exist" containerID="2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.996421 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98"} err="failed to get container status \"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\": rpc error: code = NotFound desc = could not find container \"2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98\": container with ID starting with 2a417f175e5da6e8a1f783863f4d47248385b874598ac779259cd0fe61d3ce98 not found: ID does not exist" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.996463 4774 scope.go:117] "RemoveContainer" containerID="b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea" Jan 27 00:11:01 crc kubenswrapper[4774]: E0127 00:11:01.997016 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\": container with ID starting with b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea not found: ID does not exist" containerID="b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea" Jan 27 00:11:01 crc kubenswrapper[4774]: I0127 00:11:01.997038 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea"} err="failed to get container status \"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\": rpc error: code = NotFound desc = could not find container \"b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea\": container 
with ID starting with b55258601797bf968c3582feb3040311cce5c2ec0a4ae0f32608eea46ab137ea not found: ID does not exist" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.008612 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.009218 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.009666 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.011032 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.011705 4774 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: I0127 00:11:02.011788 4774 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.012290 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="200ms" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.214248 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="400ms" Jan 27 00:11:02 crc kubenswrapper[4774]: I0127 00:11:02.358493 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: I0127 00:11:02.358811 4774 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:02 crc kubenswrapper[4774]: I0127 00:11:02.362976 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 27 00:11:02 crc kubenswrapper[4774]: E0127 00:11:02.615084 4774 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="800ms" Jan 27 00:11:03 crc kubenswrapper[4774]: E0127 00:11:03.416625 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="1.6s" Jan 27 00:11:05 crc kubenswrapper[4774]: E0127 00:11:05.020221 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="3.2s" Jan 27 00:11:08 crc kubenswrapper[4774]: E0127 00:11:08.221442 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.27:6443: connect: connection refused" interval="6.4s" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.357623 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.359834 4774 status_manager.go:851] "Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.379972 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.380058 4774 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:09 crc kubenswrapper[4774]: E0127 00:11:09.380636 4774 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.381306 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:09 crc kubenswrapper[4774]: W0127 00:11:09.428736 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-a47f3d24ba1d361569d2ac9734e7bb855c5a7738964ae3f806411cc89036bacd WatchSource:0}: Error finding container a47f3d24ba1d361569d2ac9734e7bb855c5a7738964ae3f806411cc89036bacd: Status 404 returned error can't find the container with id a47f3d24ba1d361569d2ac9734e7bb855c5a7738964ae3f806411cc89036bacd Jan 27 00:11:09 crc kubenswrapper[4774]: E0127 00:11:09.445776 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.27:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e6e021e22ae84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,LastTimestamp:2026-01-27 00:10:58.977762948 +0000 UTC m=+237.283539862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.914497 4774 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="bc27f6ca9a0b7ed3fe2850a1707feaeaa157fc862b1e65188fbbf91a7c9ea542" exitCode=0 Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.914600 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"bc27f6ca9a0b7ed3fe2850a1707feaeaa157fc862b1e65188fbbf91a7c9ea542"} Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.915732 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a47f3d24ba1d361569d2ac9734e7bb855c5a7738964ae3f806411cc89036bacd"} Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.916427 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.916462 4774 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:09 crc kubenswrapper[4774]: E0127 00:11:09.917167 4774 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:09 crc kubenswrapper[4774]: I0127 00:11:09.917178 4774 status_manager.go:851] 
"Failed to get status for pod" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.27:6443: connect: connection refused" Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.842691 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.843174 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.934646 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dfd2a33eeb4aa36e42ad8aff35ecce104aee6e7b9baca6b4d497997968743be3"} Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.934723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77e714e2f02139e503acd2d790cb3eecf1694e951047d996b7024d01ac261e79"} Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.934741 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"21b062e8c053cd29472647f0280ccb7d34b7646039e3bdc0ed57b8bad7fb9c0b"} Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.939889 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.939954 4774 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78" exitCode=1 Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.940004 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78"} Jan 27 00:11:10 crc kubenswrapper[4774]: I0127 00:11:10.940708 4774 scope.go:117] "RemoveContainer" containerID="3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78" Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.952179 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.952497 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c"} Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957016 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"41dcd7de625f79174e76b9152db0462b80dd3b5317720b3b2640da26297b4d8a"} Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957102 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85ebfcfdcc132ec8a43257ac4dbc0c76d1f12a6df553ce3bc9cdcc3dd0ae7e31"} Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957207 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957247 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:11 crc kubenswrapper[4774]: I0127 00:11:11.957272 4774 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:14 crc kubenswrapper[4774]: I0127 00:11:14.382210 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:14 crc kubenswrapper[4774]: I0127 00:11:14.382282 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:14 crc kubenswrapper[4774]: I0127 00:11:14.390664 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:15 crc kubenswrapper[4774]: I0127 00:11:15.463353 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:11:16 crc kubenswrapper[4774]: I0127 00:11:16.968432 4774 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:16 crc kubenswrapper[4774]: I0127 00:11:16.994152 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:16 crc kubenswrapper[4774]: I0127 00:11:16.994178 4774 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:16 crc kubenswrapper[4774]: I0127 00:11:16.999601 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:17 crc kubenswrapper[4774]: I0127 00:11:17.003476 4774 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="282a1d53-c7c3-4138-866e-4d597661333a" Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.000704 4774 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.000749 4774 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a62b292b-334d-470c-97c4-b86640ebc5bf" Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.254821 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.255119 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 27 00:11:18 crc kubenswrapper[4774]: I0127 00:11:18.255186 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 27 00:11:22 crc kubenswrapper[4774]: I0127 00:11:22.377961 4774 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="282a1d53-c7c3-4138-866e-4d597661333a" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.141787 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.266488 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.447491 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.635783 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 00:11:23 crc kubenswrapper[4774]: I0127 00:11:23.958777 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.295285 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.423484 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.599106 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.779545 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.928351 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.933160 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 00:11:24 crc kubenswrapper[4774]: I0127 00:11:24.950205 4774 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"oauth-serving-cert" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.019048 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.335846 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.545134 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.730457 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 00:11:25 crc kubenswrapper[4774]: I0127 00:11:25.971040 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.110904 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.111287 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.266171 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.301274 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.464311 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 00:11:26 crc kubenswrapper[4774]: I0127 00:11:26.861924 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.078367 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.442591 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.566144 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.580955 4774 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 00:11:27 crc kubenswrapper[4774]: I0127 00:11:27.659697 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.025502 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.194144 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.255173 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: 
Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.255288 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 27 00:11:28 crc kubenswrapper[4774]: I0127 00:11:28.744894 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 00:11:29 crc kubenswrapper[4774]: I0127 00:11:29.452191 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 00:11:29 crc kubenswrapper[4774]: I0127 00:11:29.517235 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 00:11:29 crc kubenswrapper[4774]: I0127 00:11:29.859666 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.327845 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.537497 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.666683 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.753928 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.908576 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.912852 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 00:11:30 crc kubenswrapper[4774]: I0127 00:11:30.949238 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.150084 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.159212 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.532289 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.533472 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 00:11:31 crc kubenswrapper[4774]: I0127 00:11:31.593750 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 00:11:31 crc 
kubenswrapper[4774]: I0127 00:11:31.750145 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.136969 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.280061 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.340448 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.451147 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.484036 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.638225 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.658903 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.690835 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.825612 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.883714 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 00:11:32 crc kubenswrapper[4774]: I0127 00:11:32.973927 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.044673 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.338698 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.499799 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.596334 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.614264 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.664924 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.693120 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 
00:11:33.712749 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.753032 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.804251 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.938556 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.938733 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 00:11:33 crc kubenswrapper[4774]: I0127 00:11:33.955050 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.001083 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.085414 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.153947 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.200452 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.298578 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.442781 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.510795 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.624756 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.682034 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.687321 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.688907 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.751819 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.801434 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.815755 
4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.920516 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.933749 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.969023 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 00:11:34 crc kubenswrapper[4774]: I0127 00:11:34.986331 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.087080 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.119498 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.212131 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.214045 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.224222 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.289369 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.378468 4774 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.389022 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.389095 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.395466 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.396982 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.435474 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.436568 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.436545893999998 podStartE2EDuration="19.436545894s" podCreationTimestamp="2026-01-27 00:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:11:35.414832771 +0000 UTC m=+273.720609655" 
watchObservedRunningTime="2026-01-27 00:11:35.436545894 +0000 UTC m=+273.742322768" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.520150 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.608620 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.666795 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.669266 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.674912 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.791427 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.855178 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.871781 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.878668 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.894667 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.983833 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 00:11:35 crc kubenswrapper[4774]: I0127 00:11:35.990371 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.011801 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.083416 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.326670 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.375797 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.502659 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.545452 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.643319 4774 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.663690 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.674852 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.745849 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.877315 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 00:11:36 crc kubenswrapper[4774]: I0127 00:11:36.937990 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.024517 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.129170 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.129736 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.327571 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.363593 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.396491 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.403136 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.420633 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.515394 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.618903 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.688589 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.704385 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.740796 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 00:11:37 crc 
kubenswrapper[4774]: I0127 00:11:37.816772 4774 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.882318 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.926090 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.930012 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 00:11:37 crc kubenswrapper[4774]: I0127 00:11:37.985385 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.129329 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.141270 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.149644 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.165812 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.233087 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.248511 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.254767 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.254900 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.254983 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.255801 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.255980 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c" gracePeriod=30 Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.263787 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.288692 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.308226 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.344481 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.407155 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.433609 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.575720 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.606743 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.632521 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.646872 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.664743 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.822659 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.827608 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.832778 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.928944 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.966944 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 00:11:38 crc kubenswrapper[4774]: I0127 00:11:38.979900 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.113422 4774 reflector.go:368] Caches 
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.113422 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.132710 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.139040 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.258201 4774 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.294597 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.404809 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.477626 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.478326 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.491223 4774 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.491441 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" gracePeriod=5
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.881964 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 27 00:11:39 crc kubenswrapper[4774]: I0127 00:11:39.913236 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.059501 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.147838 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.153818 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.191249 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.298889 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.328035 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
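
The recurring reflector.go:368 records, 'Caches populated for *v1.ConfigMap/*v1.Secret from object-"<namespace>"/"<name>"', mark the point at which the kubelet's watch cache for each Secret or ConfigMap referenced by pods on this node has completed its initial list and is serving from cache. The kubelet drives these through its own per-object secret/configmap managers, but the underlying list-watch-cache mechanism is the one client-go exposes; the sketch below is an illustration of that mechanism with a namespaced ConfigMap informer, not kubelet code, and the kubeconfig handling and namespace choice are assumptions.

// Illustrative client-go sketch of the list/watch caches behind the
// "Caches populated for *v1.ConfigMap ..." records above.
package main

import (
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the default location; the kubelet itself
	// authenticates with its node credentials instead.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Watch ConfigMaps in a single namespace, mirroring the per-namespace
	// object references seen in the log (e.g. "openshift-authentication").
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 10*time.Minute, informers.WithNamespace("openshift-authentication"))
	cmInformer := factory.Core().V1().ConfigMaps().Informer()
	cmInformer.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			cm := obj.(*corev1.ConfigMap)
			fmt.Printf("cached ConfigMap %s/%s\n", cm.Namespace, cm.Name)
		},
	})

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	// Roughly the moment the kubelet logs "Caches populated": the initial
	// LIST is stored and the WATCH is established.
	cache.WaitForCacheSync(stop, cmInformer.HasSynced)
	fmt.Println("ConfigMap cache synced")
}
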
object-"openshift-multus"/"cni-copy-resources" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.422051 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.562812 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.705316 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.734811 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.792232 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 00:11:40 crc kubenswrapper[4774]: I0127 00:11:40.872945 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.004919 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.042003 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.053052 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.075763 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.141238 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.220247 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.281399 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.317905 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.358140 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.413396 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.452469 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.487083 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.489976 4774 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.511908 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.687287 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.728769 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.777549 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.779845 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.866986 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.941349 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 00:11:41 crc kubenswrapper[4774]: I0127 00:11:41.981046 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.005790 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.018193 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.035206 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.040160 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.157587 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.171318 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.215266 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.263616 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.324532 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.339022 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 
00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.364391 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.378120 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.490526 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.549439 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.859897 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.891456 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 00:11:42 crc kubenswrapper[4774]: I0127 00:11:42.998003 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.033458 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.065654 4774 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.120212 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.195313 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.196086 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.480905 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.868303 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.977634 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 00:11:43 crc kubenswrapper[4774]: I0127 00:11:43.985130 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.039796 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.139572 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.256930 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.444753 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.593529 4774 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.639707 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.795625 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.972476 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 00:11:44 crc kubenswrapper[4774]: I0127 00:11:44.973957 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.062676 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.062746 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.070016 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.138295 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.167205 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.167905 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.167949 4774 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" exitCode=137 Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.168005 4774 scope.go:117] "RemoveContainer" containerID="274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.168043 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.198028 4774 scope.go:117] "RemoveContainer" containerID="274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" Jan 27 00:11:45 crc kubenswrapper[4774]: E0127 00:11:45.198919 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2\": container with ID starting with 274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2 not found: ID does not exist" containerID="274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.198977 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2"} err="failed to get container status \"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2\": rpc error: code = NotFound desc = could not find container \"274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2\": container with ID starting with 274b25157071a9a3eb70333d668d2151ab8bb92d1e668f32b659480e7730f4f2 not found: ID does not exist" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.206352 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216758 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216804 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216844 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216892 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.216917 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217002 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217067 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217093 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217181 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217566 4774 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217613 4774 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217626 4774 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.217638 4774 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.225548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.250518 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.318675 4774 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.724344 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 00:11:45 crc kubenswrapper[4774]: I0127 00:11:45.920595 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 00:11:46 crc kubenswrapper[4774]: I0127 00:11:46.020589 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 00:11:46 crc kubenswrapper[4774]: I0127 00:11:46.241946 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 00:11:46 crc kubenswrapper[4774]: I0127 00:11:46.363563 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 27 00:11:48 crc kubenswrapper[4774]: I0127 00:11:48.032977 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 00:12:02 crc kubenswrapper[4774]: I0127 00:12:02.145743 4774 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.316393 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318690 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318742 4774 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c" exitCode=137 Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1df932f38c8736acf15a35fdc2454e2dffe503f54a5970b6785ed2254a84337c"} Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318800 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b5a2ea12dbf518bffc634af342a252a1cefda672b27fbb48f778587800cc58c7"} Jan 27 00:12:09 crc kubenswrapper[4774]: I0127 00:12:09.318818 4774 scope.go:117] "RemoveContainer" containerID="3c7f6a3552de34a951232ae8975719d2abc371141413741724ccc9b597758a78" Jan 27 00:12:10 crc 
kubenswrapper[4774]: I0127 00:12:10.326189 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Jan 27 00:12:15 crc kubenswrapper[4774]: I0127 00:12:15.463442 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:12:18 crc kubenswrapper[4774]: I0127 00:12:18.254382 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:12:18 crc kubenswrapper[4774]: I0127 00:12:18.259924 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:12:25 crc kubenswrapper[4774]: I0127 00:12:25.467667 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.444673 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.445933 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerName="route-controller-manager" containerID="cri-o://6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" gracePeriod=30 Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.455873 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.456442 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerName="controller-manager" containerID="cri-o://475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" gracePeriod=30 Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.866569 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.871968 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.944997 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945069 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") pod \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945105 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945125 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") pod \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945146 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945171 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") pod \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945312 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") pod \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\" (UID: \"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945329 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945360 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") pod \"9166b710-4bb0-4fc0-8e54-45907543c22f\" (UID: \"9166b710-4bb0-4fc0-8e54-45907543c22f\") " Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945854 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" 
(UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.945900 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca" (OuterVolumeSpecName: "client-ca") pod "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" (UID: "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.946634 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config" (OuterVolumeSpecName: "config") pod "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" (UID: "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.946732 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca" (OuterVolumeSpecName: "client-ca") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" (UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.947079 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config" (OuterVolumeSpecName: "config") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" (UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.950795 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" (UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.950894 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" (UID: "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.951266 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx" (OuterVolumeSpecName: "kube-api-access-4lxgx") pod "9166b710-4bb0-4fc0-8e54-45907543c22f" (UID: "9166b710-4bb0-4fc0-8e54-45907543c22f"). InnerVolumeSpecName "kube-api-access-4lxgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:12:27 crc kubenswrapper[4774]: I0127 00:12:27.951348 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps" (OuterVolumeSpecName: "kube-api-access-npcps") pod "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" (UID: "8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff"). 
InnerVolumeSpecName "kube-api-access-npcps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046176 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046213 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npcps\" (UniqueName: \"kubernetes.io/projected/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-kube-api-access-npcps\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046222 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9166b710-4bb0-4fc0-8e54-45907543c22f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046230 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lxgx\" (UniqueName: \"kubernetes.io/projected/9166b710-4bb0-4fc0-8e54-45907543c22f-kube-api-access-4lxgx\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046239 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046249 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046257 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046265 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.046274 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9166b710-4bb0-4fc0-8e54-45907543c22f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.447793 4774 generic.go:334] "Generic (PLEG): container finished" podID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerID="475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" exitCode=0 Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.447875 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" event={"ID":"9166b710-4bb0-4fc0-8e54-45907543c22f","Type":"ContainerDied","Data":"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c"} Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.447921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" event={"ID":"9166b710-4bb0-4fc0-8e54-45907543c22f","Type":"ContainerDied","Data":"20e7f45e0bc628b7de64cc93ac19fdef43b151e10efd40aab961cc3102eb973d"} Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.447958 4774 scope.go:117] "RemoveContainer" 
containerID="475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.448623 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5b8h8" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.448970 4774 generic.go:334] "Generic (PLEG): container finished" podID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerID="6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" exitCode=0 Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.448993 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" event={"ID":"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff","Type":"ContainerDied","Data":"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded"} Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.449036 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" event={"ID":"8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff","Type":"ContainerDied","Data":"63fe7051984d9e95bbe56ab4890d8824695c38cf0105a3b96dcf3496f186c863"} Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.449151 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.465284 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.466065 4774 scope.go:117] "RemoveContainer" containerID="475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.466548 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c\": container with ID starting with 475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c not found: ID does not exist" containerID="475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.466595 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c"} err="failed to get container status \"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c\": rpc error: code = NotFound desc = could not find container \"475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c\": container with ID starting with 475dcce3ef352f2be3a44b71ba3a1d6f9b156fff76ad1be0e11e18804994ff3c not found: ID does not exist" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.466623 4774 scope.go:117] "RemoveContainer" containerID="6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.469475 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gthj4"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.479330 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.480119 4774 
scope.go:117] "RemoveContainer" containerID="6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.480457 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded\": container with ID starting with 6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded not found: ID does not exist" containerID="6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.480493 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded"} err="failed to get container status \"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded\": rpc error: code = NotFound desc = could not find container \"6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded\": container with ID starting with 6342f9f1266217ff76871b71b94a1a4126ff622479833de3c4d3c69900091ded not found: ID does not exist" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.482933 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5b8h8"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.698998 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-759c8d74f8-htq9z"] Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.699620 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerName="route-controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.699662 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerName="route-controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.699690 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.699705 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.699727 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" containerName="installer" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.699741 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" containerName="installer" Jan 27 00:12:28 crc kubenswrapper[4774]: E0127 00:12:28.699782 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerName="controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.699796 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerName="controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.700014 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05ddc41-daa1-4deb-a5bf-d5b5f9181a1c" containerName="installer" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.700037 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 00:12:28 crc 
kubenswrapper[4774]: I0127 00:12:28.700058 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" containerName="route-controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.700083 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" containerName="controller-manager" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.701009 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.703715 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.703847 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.703898 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.703937 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.704790 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.705352 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.705899 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.706328 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.713674 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.715989 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.716153 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.716198 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.716283 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.716361 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.722687 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-759c8d74f8-htq9z"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759112 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5202c840-a2f1-48c0-884f-9c74c38daa3b-serving-cert\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759188 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-config\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrgf9\" (UniqueName: \"kubernetes.io/projected/5202c840-a2f1-48c0-884f-9c74c38daa3b-kube-api-access-nrgf9\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759407 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-proxy-ca-bundles\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.759459 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-client-ca\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.762746 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.809380 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861419 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5202c840-a2f1-48c0-884f-9c74c38daa3b-serving-cert\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861559 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-config\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861585 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrgf9\" (UniqueName: \"kubernetes.io/projected/5202c840-a2f1-48c0-884f-9c74c38daa3b-kube-api-access-nrgf9\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861622 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861655 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861724 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-proxy-ca-bundles\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861763 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-client-ca\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.861824 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.864137 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-config\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.864157 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-client-ca\") pod 
\"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.864254 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5202c840-a2f1-48c0-884f-9c74c38daa3b-proxy-ca-bundles\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.884929 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5202c840-a2f1-48c0-884f-9c74c38daa3b-serving-cert\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.886674 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrgf9\" (UniqueName: \"kubernetes.io/projected/5202c840-a2f1-48c0-884f-9c74c38daa3b-kube-api-access-nrgf9\") pod \"controller-manager-759c8d74f8-htq9z\" (UID: \"5202c840-a2f1-48c0-884f-9c74c38daa3b\") " pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.963388 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.963443 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.963463 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.963520 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.964351 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " 
pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.964564 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.967014 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:28 crc kubenswrapper[4774]: I0127 00:12:28.979371 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") pod \"route-controller-manager-76d65966c5-vj2sn\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.063352 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.084449 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.380410 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.455071 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" event={"ID":"4240439b-6b80-4a41-960f-8629057b254a","Type":"ContainerStarted","Data":"a889b3e4423ddcb596fb765d9b93f601d997c73d3159d03696ff4a486f6a59c1"} Jan 27 00:12:29 crc kubenswrapper[4774]: I0127 00:12:29.516934 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-759c8d74f8-htq9z"] Jan 27 00:12:29 crc kubenswrapper[4774]: W0127 00:12:29.524211 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5202c840_a2f1_48c0_884f_9c74c38daa3b.slice/crio-71af2469ad116517b177fec4ef5e13b35f3c8a801cb947de1df4a38e9d5e6b61 WatchSource:0}: Error finding container 71af2469ad116517b177fec4ef5e13b35f3c8a801cb947de1df4a38e9d5e6b61: Status 404 returned error can't find the container with id 71af2469ad116517b177fec4ef5e13b35f3c8a801cb947de1df4a38e9d5e6b61 Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.363413 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff" path="/var/lib/kubelet/pods/8d6a02f8-e5e6-49b7-99cb-29d40ba9fdff/volumes" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.364299 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9166b710-4bb0-4fc0-8e54-45907543c22f" 
path="/var/lib/kubelet/pods/9166b710-4bb0-4fc0-8e54-45907543c22f/volumes" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.462606 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" event={"ID":"5202c840-a2f1-48c0-884f-9c74c38daa3b","Type":"ContainerStarted","Data":"f39e82d9fd3aa879ae91733f29d3a052848b6be8772b3db358d6519035089c4d"} Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.462650 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" event={"ID":"5202c840-a2f1-48c0-884f-9c74c38daa3b","Type":"ContainerStarted","Data":"71af2469ad116517b177fec4ef5e13b35f3c8a801cb947de1df4a38e9d5e6b61"} Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.462844 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.465668 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" event={"ID":"4240439b-6b80-4a41-960f-8629057b254a","Type":"ContainerStarted","Data":"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b"} Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.465925 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.470787 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.472584 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.491278 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-759c8d74f8-htq9z" podStartSLOduration=3.491261171 podStartE2EDuration="3.491261171s" podCreationTimestamp="2026-01-27 00:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:12:30.485657572 +0000 UTC m=+328.791434456" watchObservedRunningTime="2026-01-27 00:12:30.491261171 +0000 UTC m=+328.797038055" Jan 27 00:12:30 crc kubenswrapper[4774]: I0127 00:12:30.509192 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" podStartSLOduration=3.50917262 podStartE2EDuration="3.50917262s" podCreationTimestamp="2026-01-27 00:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:12:30.504786042 +0000 UTC m=+328.810562936" watchObservedRunningTime="2026-01-27 00:12:30.50917262 +0000 UTC m=+328.814949494" Jan 27 00:12:36 crc kubenswrapper[4774]: I0127 00:12:36.675613 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:12:36 crc kubenswrapper[4774]: I0127 
00:12:36.677349 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.136592 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hdn9m"] Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.137671 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.147998 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hdn9m"] Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.275552 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.275720 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-registry-tls\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276422 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-registry-certificates\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276637 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7737e34-de67-4910-b082-a754145009df-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276711 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7737e34-de67-4910-b082-a754145009df-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276760 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-bound-sa-token\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 
00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276805 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-trusted-ca\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.276845 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x49b\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-kube-api-access-4x49b\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.312836 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379521 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-registry-certificates\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7737e34-de67-4910-b082-a754145009df-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379803 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7737e34-de67-4910-b082-a754145009df-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379824 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-bound-sa-token\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379840 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-trusted-ca\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379870 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x49b\" 
(UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-kube-api-access-4x49b\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.379919 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-registry-tls\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.380303 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7737e34-de67-4910-b082-a754145009df-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.381059 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-registry-certificates\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.381176 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7737e34-de67-4910-b082-a754145009df-trusted-ca\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.394680 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-registry-tls\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.394678 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7737e34-de67-4910-b082-a754145009df-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.396714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-bound-sa-token\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.398223 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x49b\" (UniqueName: \"kubernetes.io/projected/d7737e34-de67-4910-b082-a754145009df-kube-api-access-4x49b\") pod \"image-registry-66df7c8f76-hdn9m\" (UID: \"d7737e34-de67-4910-b082-a754145009df\") " pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc 
kubenswrapper[4774]: I0127 00:12:52.454126 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:52 crc kubenswrapper[4774]: I0127 00:12:52.853475 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hdn9m"] Jan 27 00:12:52 crc kubenswrapper[4774]: W0127 00:12:52.862704 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7737e34_de67_4910_b082_a754145009df.slice/crio-1aa43303ac8854b242d38ce0e0abe34f9332e659e4be4bef5673b9f40595aa2c WatchSource:0}: Error finding container 1aa43303ac8854b242d38ce0e0abe34f9332e659e4be4bef5673b9f40595aa2c: Status 404 returned error can't find the container with id 1aa43303ac8854b242d38ce0e0abe34f9332e659e4be4bef5673b9f40595aa2c Jan 27 00:12:53 crc kubenswrapper[4774]: I0127 00:12:53.588567 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" event={"ID":"d7737e34-de67-4910-b082-a754145009df","Type":"ContainerStarted","Data":"a8252c2aab93f4db7b8b32b5a073eda6d90ee2c2f1951b46fe52073babd45079"} Jan 27 00:12:53 crc kubenswrapper[4774]: I0127 00:12:53.588629 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" event={"ID":"d7737e34-de67-4910-b082-a754145009df","Type":"ContainerStarted","Data":"1aa43303ac8854b242d38ce0e0abe34f9332e659e4be4bef5673b9f40595aa2c"} Jan 27 00:12:53 crc kubenswrapper[4774]: I0127 00:12:53.588725 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:12:53 crc kubenswrapper[4774]: I0127 00:12:53.604767 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" podStartSLOduration=1.60474297 podStartE2EDuration="1.60474297s" podCreationTimestamp="2026-01-27 00:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:12:53.603154688 +0000 UTC m=+351.908931592" watchObservedRunningTime="2026-01-27 00:12:53.60474297 +0000 UTC m=+351.910519864" Jan 27 00:13:06 crc kubenswrapper[4774]: I0127 00:13:06.675544 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:13:06 crc kubenswrapper[4774]: I0127 00:13:06.676606 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:13:07 crc kubenswrapper[4774]: I0127 00:13:07.780497 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:13:07 crc kubenswrapper[4774]: I0127 00:13:07.781328 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" 
podUID="4240439b-6b80-4a41-960f-8629057b254a" containerName="route-controller-manager" containerID="cri-o://a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" gracePeriod=30 Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.135578 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.212047 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") pod \"4240439b-6b80-4a41-960f-8629057b254a\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.212630 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") pod \"4240439b-6b80-4a41-960f-8629057b254a\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.213851 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") pod \"4240439b-6b80-4a41-960f-8629057b254a\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.214336 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca" (OuterVolumeSpecName: "client-ca") pod "4240439b-6b80-4a41-960f-8629057b254a" (UID: "4240439b-6b80-4a41-960f-8629057b254a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.214469 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config" (OuterVolumeSpecName: "config") pod "4240439b-6b80-4a41-960f-8629057b254a" (UID: "4240439b-6b80-4a41-960f-8629057b254a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.214487 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") pod \"4240439b-6b80-4a41-960f-8629057b254a\" (UID: \"4240439b-6b80-4a41-960f-8629057b254a\") " Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.215116 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.215575 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4240439b-6b80-4a41-960f-8629057b254a-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.221045 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4240439b-6b80-4a41-960f-8629057b254a" (UID: "4240439b-6b80-4a41-960f-8629057b254a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.222665 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7" (OuterVolumeSpecName: "kube-api-access-qjrc7") pod "4240439b-6b80-4a41-960f-8629057b254a" (UID: "4240439b-6b80-4a41-960f-8629057b254a"). InnerVolumeSpecName "kube-api-access-qjrc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.316669 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4240439b-6b80-4a41-960f-8629057b254a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.316718 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjrc7\" (UniqueName: \"kubernetes.io/projected/4240439b-6b80-4a41-960f-8629057b254a-kube-api-access-qjrc7\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672235 4774 generic.go:334] "Generic (PLEG): container finished" podID="4240439b-6b80-4a41-960f-8629057b254a" containerID="a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" exitCode=0 Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672304 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" event={"ID":"4240439b-6b80-4a41-960f-8629057b254a","Type":"ContainerDied","Data":"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b"} Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672346 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672368 4774 scope.go:117] "RemoveContainer" containerID="a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.672351 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn" event={"ID":"4240439b-6b80-4a41-960f-8629057b254a","Type":"ContainerDied","Data":"a889b3e4423ddcb596fb765d9b93f601d997c73d3159d03696ff4a486f6a59c1"} Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.693383 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.693684 4774 scope.go:117] "RemoveContainer" containerID="a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" Jan 27 00:13:08 crc kubenswrapper[4774]: E0127 00:13:08.694378 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b\": container with ID starting with a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b not found: ID does not exist" containerID="a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.694480 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b"} err="failed to get container status \"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b\": rpc error: code = NotFound desc = could not find container \"a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b\": container with ID starting with a7f5880fbdc2c819feece8228a5f65055d02811399e2f60ebe55d2bcc569692b not found: ID does not exist" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.696781 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76d65966c5-vj2sn"] Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.933440 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb"] Jan 27 00:13:08 crc kubenswrapper[4774]: E0127 00:13:08.933940 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4240439b-6b80-4a41-960f-8629057b254a" containerName="route-controller-manager" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.933966 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4240439b-6b80-4a41-960f-8629057b254a" containerName="route-controller-manager" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.934194 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4240439b-6b80-4a41-960f-8629057b254a" containerName="route-controller-manager" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.934970 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.937188 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.939670 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.940009 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.940250 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.940376 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.940463 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 00:13:08 crc kubenswrapper[4774]: I0127 00:13:08.943228 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb"] Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.026914 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-client-ca\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.026994 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbw4k\" (UniqueName: \"kubernetes.io/projected/b5957fa6-b497-4d76-9577-6b17bc66a6f4-kube-api-access-fbw4k\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.027035 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5957fa6-b497-4d76-9577-6b17bc66a6f4-serving-cert\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.027065 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-config\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.128832 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-client-ca\") pod 
\"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.129177 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbw4k\" (UniqueName: \"kubernetes.io/projected/b5957fa6-b497-4d76-9577-6b17bc66a6f4-kube-api-access-fbw4k\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.129336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5957fa6-b497-4d76-9577-6b17bc66a6f4-serving-cert\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.129433 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-config\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.131144 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-config\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.131147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5957fa6-b497-4d76-9577-6b17bc66a6f4-client-ca\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.139573 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5957fa6-b497-4d76-9577-6b17bc66a6f4-serving-cert\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.147229 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbw4k\" (UniqueName: \"kubernetes.io/projected/b5957fa6-b497-4d76-9577-6b17bc66a6f4-kube-api-access-fbw4k\") pod \"route-controller-manager-6859b4ff9b-nswtb\" (UID: \"b5957fa6-b497-4d76-9577-6b17bc66a6f4\") " pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.257215 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:09 crc kubenswrapper[4774]: I0127 00:13:09.669386 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb"] Jan 27 00:13:09 crc kubenswrapper[4774]: W0127 00:13:09.689606 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5957fa6_b497_4d76_9577_6b17bc66a6f4.slice/crio-12c59ded9a3e4c29f9f1057dbe4235ceaf95756eae8df9a6010b4a67a0b75cb5 WatchSource:0}: Error finding container 12c59ded9a3e4c29f9f1057dbe4235ceaf95756eae8df9a6010b4a67a0b75cb5: Status 404 returned error can't find the container with id 12c59ded9a3e4c29f9f1057dbe4235ceaf95756eae8df9a6010b4a67a0b75cb5 Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.364254 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4240439b-6b80-4a41-960f-8629057b254a" path="/var/lib/kubelet/pods/4240439b-6b80-4a41-960f-8629057b254a/volumes" Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.710402 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" event={"ID":"b5957fa6-b497-4d76-9577-6b17bc66a6f4","Type":"ContainerStarted","Data":"c9125167004f2e4cacd5db93979dd557dab8dd6060e6b71674fa8b5880547acb"} Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.710468 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" event={"ID":"b5957fa6-b497-4d76-9577-6b17bc66a6f4","Type":"ContainerStarted","Data":"12c59ded9a3e4c29f9f1057dbe4235ceaf95756eae8df9a6010b4a67a0b75cb5"} Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.710994 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.717502 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" Jan 27 00:13:10 crc kubenswrapper[4774]: I0127 00:13:10.759840 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6859b4ff9b-nswtb" podStartSLOduration=3.759815139 podStartE2EDuration="3.759815139s" podCreationTimestamp="2026-01-27 00:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:13:10.738925321 +0000 UTC m=+369.044702215" watchObservedRunningTime="2026-01-27 00:13:10.759815139 +0000 UTC m=+369.065592043" Jan 27 00:13:12 crc kubenswrapper[4774]: I0127 00:13:12.464030 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hdn9m" Jan 27 00:13:12 crc kubenswrapper[4774]: I0127 00:13:12.569680 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.899269 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.900832 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-pj24q" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="registry-server" containerID="cri-o://4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.928526 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.929023 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lnlbr" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="registry-server" containerID="cri-o://5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.941843 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.942101 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" containerID="cri-o://b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.951607 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.951887 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ds9fx" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="registry-server" containerID="cri-o://d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.966143 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.966550 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-97k6k" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" containerID="cri-o://e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" gracePeriod=30 Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.973696 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcz6w"] Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.974570 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:19 crc kubenswrapper[4774]: I0127 00:13:19.977515 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcz6w"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.030946 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wv2t\" (UniqueName: \"kubernetes.io/projected/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-kube-api-access-9wv2t\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.031135 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.031278 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.132115 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.132435 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.132618 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wv2t\" (UniqueName: \"kubernetes.io/projected/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-kube-api-access-9wv2t\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.134041 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.142085 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.154996 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wv2t\" (UniqueName: \"kubernetes.io/projected/fb20bbdd-0a13-4d07-8c4a-8c2285de3173-kube-api-access-9wv2t\") pod \"marketplace-operator-79b997595-bcz6w\" (UID: \"fb20bbdd-0a13-4d07-8c4a-8c2285de3173\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.347088 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.365045 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.437461 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") pod \"af002227-deb6-4a24-8ce5-051f93bc178b\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.442561 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") pod \"af002227-deb6-4a24-8ce5-051f93bc178b\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.442718 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") pod \"af002227-deb6-4a24-8ce5-051f93bc178b\" (UID: \"af002227-deb6-4a24-8ce5-051f93bc178b\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.450763 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities" (OuterVolumeSpecName: "utilities") pod "af002227-deb6-4a24-8ce5-051f93bc178b" (UID: "af002227-deb6-4a24-8ce5-051f93bc178b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.454009 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9" (OuterVolumeSpecName: "kube-api-access-qlfv9") pod "af002227-deb6-4a24-8ce5-051f93bc178b" (UID: "af002227-deb6-4a24-8ce5-051f93bc178b"). InnerVolumeSpecName "kube-api-access-qlfv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.455944 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.465637 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.494774 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.500385 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.545917 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") pod \"6cad24dd-acc4-40e5-8380-e0d74be79921\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.545975 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") pod \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546020 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") pod \"386f196b-c4bc-4fea-924b-c0487a352310\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546045 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") pod \"386f196b-c4bc-4fea-924b-c0487a352310\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546130 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") pod \"6cad24dd-acc4-40e5-8380-e0d74be79921\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546171 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") pod \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546188 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") pod \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546205 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") pod \"386f196b-c4bc-4fea-924b-c0487a352310\" (UID: \"386f196b-c4bc-4fea-924b-c0487a352310\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546237 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") pod \"6cad24dd-acc4-40e5-8380-e0d74be79921\" (UID: \"6cad24dd-acc4-40e5-8380-e0d74be79921\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546262 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") pod \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546290 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") pod \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\" (UID: \"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546319 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") pod \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\" (UID: \"687ff3b2-f773-482b-9bf7-1b6135b4d6ac\") " Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546527 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.546548 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlfv9\" (UniqueName: \"kubernetes.io/projected/af002227-deb6-4a24-8ce5-051f93bc178b-kube-api-access-qlfv9\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.547212 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities" (OuterVolumeSpecName: "utilities") pod "687ff3b2-f773-482b-9bf7-1b6135b4d6ac" (UID: "687ff3b2-f773-482b-9bf7-1b6135b4d6ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.548140 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities" (OuterVolumeSpecName: "utilities") pod "6cad24dd-acc4-40e5-8380-e0d74be79921" (UID: "6cad24dd-acc4-40e5-8380-e0d74be79921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.551332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr" (OuterVolumeSpecName: "kube-api-access-xh2lr") pod "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" (UID: "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72"). InnerVolumeSpecName "kube-api-access-xh2lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.552124 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities" (OuterVolumeSpecName: "utilities") pod "386f196b-c4bc-4fea-924b-c0487a352310" (UID: "386f196b-c4bc-4fea-924b-c0487a352310"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.554323 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6" (OuterVolumeSpecName: "kube-api-access-7bbh6") pod "6cad24dd-acc4-40e5-8380-e0d74be79921" (UID: "6cad24dd-acc4-40e5-8380-e0d74be79921"). InnerVolumeSpecName "kube-api-access-7bbh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.560969 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" (UID: "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.573295 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x" (OuterVolumeSpecName: "kube-api-access-vk44x") pod "386f196b-c4bc-4fea-924b-c0487a352310" (UID: "386f196b-c4bc-4fea-924b-c0487a352310"). InnerVolumeSpecName "kube-api-access-vk44x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.574401 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42" (OuterVolumeSpecName: "kube-api-access-lcf42") pod "687ff3b2-f773-482b-9bf7-1b6135b4d6ac" (UID: "687ff3b2-f773-482b-9bf7-1b6135b4d6ac"). InnerVolumeSpecName "kube-api-access-lcf42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.575548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" (UID: "27139ba0-dc60-4e5c-aff2-dc33b8f1ff72"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.589664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "687ff3b2-f773-482b-9bf7-1b6135b4d6ac" (UID: "687ff3b2-f773-482b-9bf7-1b6135b4d6ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.590465 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af002227-deb6-4a24-8ce5-051f93bc178b" (UID: "af002227-deb6-4a24-8ce5-051f93bc178b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.619328 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cad24dd-acc4-40e5-8380-e0d74be79921" (UID: "6cad24dd-acc4-40e5-8380-e0d74be79921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.626075 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcz6w"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648261 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bbh6\" (UniqueName: \"kubernetes.io/projected/6cad24dd-acc4-40e5-8380-e0d74be79921-kube-api-access-7bbh6\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648846 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcf42\" (UniqueName: \"kubernetes.io/projected/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-kube-api-access-lcf42\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648873 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648886 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648898 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af002227-deb6-4a24-8ce5-051f93bc178b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648908 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648917 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648928 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648941 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/687ff3b2-f773-482b-9bf7-1b6135b4d6ac-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648950 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cad24dd-acc4-40e5-8380-e0d74be79921-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648960 4774 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xh2lr\" (UniqueName: \"kubernetes.io/projected/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72-kube-api-access-xh2lr\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.648969 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk44x\" (UniqueName: \"kubernetes.io/projected/386f196b-c4bc-4fea-924b-c0487a352310-kube-api-access-vk44x\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.682582 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "386f196b-c4bc-4fea-924b-c0487a352310" (UID: "386f196b-c4bc-4fea-924b-c0487a352310"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.750419 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386f196b-c4bc-4fea-924b-c0487a352310-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788034 4774 generic.go:334] "Generic (PLEG): container finished" podID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerID="5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788134 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerDied","Data":"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788183 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnlbr" event={"ID":"6cad24dd-acc4-40e5-8380-e0d74be79921","Type":"ContainerDied","Data":"9329c497510d489baaa3885ca712f040e06f04f21f51c80edb354ab47b28b6f1"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788177 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnlbr" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.788226 4774 scope.go:117] "RemoveContainer" containerID="5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.796105 4774 generic.go:334] "Generic (PLEG): container finished" podID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerID="d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.796205 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerDied","Data":"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.796254 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds9fx" event={"ID":"687ff3b2-f773-482b-9bf7-1b6135b4d6ac","Type":"ContainerDied","Data":"7c895fbfa4a8556de4f63281aed06489f2f81ec44684ed968eb3615baa9f463f"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.796344 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds9fx" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.803918 4774 generic.go:334] "Generic (PLEG): container finished" podID="af002227-deb6-4a24-8ce5-051f93bc178b" containerID="4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.803990 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerDied","Data":"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.804022 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pj24q" event={"ID":"af002227-deb6-4a24-8ce5-051f93bc178b","Type":"ContainerDied","Data":"f9e3168af24b8a48492cc6713c61bfd5396824e8a18842b10c09f8a37f4eed28"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.804113 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pj24q" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.811399 4774 scope.go:117] "RemoveContainer" containerID="20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.814180 4774 generic.go:334] "Generic (PLEG): container finished" podID="386f196b-c4bc-4fea-924b-c0487a352310" containerID="e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.814286 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerDied","Data":"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.814336 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97k6k" event={"ID":"386f196b-c4bc-4fea-924b-c0487a352310","Type":"ContainerDied","Data":"8a4f7dc871614277fda5710d46e4863b99158a3773f029cd73a7cc991db5ffae"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.814452 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97k6k" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.819085 4774 generic.go:334] "Generic (PLEG): container finished" podID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerID="b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" exitCode=0 Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.819183 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.819295 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" event={"ID":"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72","Type":"ContainerDied","Data":"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.819362 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxtrq" event={"ID":"27139ba0-dc60-4e5c-aff2-dc33b8f1ff72","Type":"ContainerDied","Data":"68dbadf6774272234ad63109a60f8646d78e5ffd2b9aca24a0498e0100dbeb4f"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.823376 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" event={"ID":"fb20bbdd-0a13-4d07-8c4a-8c2285de3173","Type":"ContainerStarted","Data":"ac3374a449cbad41720e0476a29435259f384b34285215bffe05a51f16f1af9b"} Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.825128 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.828279 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bcz6w container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" start-of-body= Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.828350 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" podUID="fb20bbdd-0a13-4d07-8c4a-8c2285de3173" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.836628 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.848167 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lnlbr"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.860501 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.862226 4774 scope.go:117] "RemoveContainer" containerID="063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.864507 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds9fx"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.877682 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.882077 4774 scope.go:117] "RemoveContainer" containerID="5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.883260 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e\": container with ID starting with 
5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e not found: ID does not exist" containerID="5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.883289 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e"} err="failed to get container status \"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e\": rpc error: code = NotFound desc = could not find container \"5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e\": container with ID starting with 5a1f08db88b403829bc77243534c043067773356cd477133348ae0a49d190d4e not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.883315 4774 scope.go:117] "RemoveContainer" containerID="20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.883698 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f\": container with ID starting with 20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f not found: ID does not exist" containerID="20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.883722 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f"} err="failed to get container status \"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f\": rpc error: code = NotFound desc = could not find container \"20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f\": container with ID starting with 20c0d772720838488cdf88ec90da0bd56b08ff04f1eabf5f71f15750b194da3f not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.883736 4774 scope.go:117] "RemoveContainer" containerID="063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.884198 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89\": container with ID starting with 063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89 not found: ID does not exist" containerID="063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.884215 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89"} err="failed to get container status \"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89\": rpc error: code = NotFound desc = could not find container \"063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89\": container with ID starting with 063f96bc55e832bf4e0dcbd25627bfcfe651786840886e9045abd441d5ed6f89 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.884227 4774 scope.go:117] "RemoveContainer" containerID="d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.898803 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-pj24q"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.899416 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" podStartSLOduration=1.8994039489999999 podStartE2EDuration="1.899403949s" podCreationTimestamp="2026-01-27 00:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:13:20.898364251 +0000 UTC m=+379.204141175" watchObservedRunningTime="2026-01-27 00:13:20.899403949 +0000 UTC m=+379.205180833" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.906315 4774 scope.go:117] "RemoveContainer" containerID="b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.916040 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.920653 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-97k6k"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.924042 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.933563 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxtrq"] Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.933679 4774 scope.go:117] "RemoveContainer" containerID="59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.950709 4774 scope.go:117] "RemoveContainer" containerID="d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.951137 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f\": container with ID starting with d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f not found: ID does not exist" containerID="d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.951166 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f"} err="failed to get container status \"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f\": rpc error: code = NotFound desc = could not find container \"d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f\": container with ID starting with d037aa53fac138e2dad0f640534ad6edf9232ebff78b8a18de57fcd348b9210f not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.951192 4774 scope.go:117] "RemoveContainer" containerID="b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.951563 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724\": container with ID starting with b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724 not found: ID does not exist" 
containerID="b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.951588 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724"} err="failed to get container status \"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724\": rpc error: code = NotFound desc = could not find container \"b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724\": container with ID starting with b54caaa01e4f02bf17bfbf76728f74afcbc976cee9b3641ce0b432a84e335724 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.951602 4774 scope.go:117] "RemoveContainer" containerID="59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.952935 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532\": container with ID starting with 59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532 not found: ID does not exist" containerID="59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.952961 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532"} err="failed to get container status \"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532\": rpc error: code = NotFound desc = could not find container \"59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532\": container with ID starting with 59bee210401cb835efc15e8d8bad13c2a6d772f8d03e4ab9da87e6884e309532 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.952973 4774 scope.go:117] "RemoveContainer" containerID="4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.967976 4774 scope.go:117] "RemoveContainer" containerID="b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.983324 4774 scope.go:117] "RemoveContainer" containerID="817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.998226 4774 scope.go:117] "RemoveContainer" containerID="4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.999229 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4\": container with ID starting with 4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4 not found: ID does not exist" containerID="4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999260 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4"} err="failed to get container status \"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4\": rpc error: code = NotFound desc = could not find container 
\"4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4\": container with ID starting with 4d1eb64816b2ee9c799c844199b29d6df5524a7e0aa222950acdff02e773a8b4 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999288 4774 scope.go:117] "RemoveContainer" containerID="b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.999589 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0\": container with ID starting with b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0 not found: ID does not exist" containerID="b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999609 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0"} err="failed to get container status \"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0\": rpc error: code = NotFound desc = could not find container \"b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0\": container with ID starting with b528c71fa07b622b5bf826e7967d6bb2e87545ab73ee6627266da1c66eb071a0 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999621 4774 scope.go:117] "RemoveContainer" containerID="817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3" Jan 27 00:13:20 crc kubenswrapper[4774]: E0127 00:13:20.999812 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3\": container with ID starting with 817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3 not found: ID does not exist" containerID="817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999835 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3"} err="failed to get container status \"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3\": rpc error: code = NotFound desc = could not find container \"817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3\": container with ID starting with 817827149fcc362e14c5e0125294117a28dafa291ceb8f657c9afdf9a5be67b3 not found: ID does not exist" Jan 27 00:13:20 crc kubenswrapper[4774]: I0127 00:13:20.999849 4774 scope.go:117] "RemoveContainer" containerID="e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.016535 4774 scope.go:117] "RemoveContainer" containerID="d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.039644 4774 scope.go:117] "RemoveContainer" containerID="783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.065255 4774 scope.go:117] "RemoveContainer" containerID="e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.066784 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b\": container with ID starting with e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b not found: ID does not exist" containerID="e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.066882 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b"} err="failed to get container status \"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b\": rpc error: code = NotFound desc = could not find container \"e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b\": container with ID starting with e5c501a4ff20c110abc81090ab5d3193a80ea7505aaa4d917cdcb347614c248b not found: ID does not exist" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.066934 4774 scope.go:117] "RemoveContainer" containerID="d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.067675 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c\": container with ID starting with d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c not found: ID does not exist" containerID="d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.067714 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c"} err="failed to get container status \"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c\": rpc error: code = NotFound desc = could not find container \"d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c\": container with ID starting with d8842ec57342ada60f36512aba2eb835210b096ca818f5ceaa5d5ff124d8c13c not found: ID does not exist" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.067739 4774 scope.go:117] "RemoveContainer" containerID="783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.068510 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9\": container with ID starting with 783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9 not found: ID does not exist" containerID="783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.068573 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9"} err="failed to get container status \"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9\": rpc error: code = NotFound desc = could not find container \"783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9\": container with ID starting with 783ba18a2ff413f64bf2db46a2024df3e298876e68e56c39f548f274e372a2b9 not found: ID does not exist" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.068635 4774 scope.go:117] "RemoveContainer" containerID="b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" Jan 27 
00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.115918 4774 scope.go:117] "RemoveContainer" containerID="b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.116696 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5\": container with ID starting with b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5 not found: ID does not exist" containerID="b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.116776 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5"} err="failed to get container status \"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5\": rpc error: code = NotFound desc = could not find container \"b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5\": container with ID starting with b8f35f19402705e2e3b8740c20af72d365237f8ce6fdd0d058006cb46578fcf5 not found: ID does not exist" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518099 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7bhhh"] Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518359 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518375 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518385 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518394 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518405 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518414 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518426 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518434 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518446 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518456 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518470 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518479 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518491 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518499 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518514 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518522 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518531 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518538 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518548 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518555 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518568 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518576 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518586 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518594 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="extract-utilities" Jan 27 00:13:21 crc kubenswrapper[4774]: E0127 00:13:21.518605 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518613 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="extract-content" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518738 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="386f196b-c4bc-4fea-924b-c0487a352310" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518753 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518767 4774 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518780 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" containerName="registry-server" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.518790 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" containerName="marketplace-operator" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.519740 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.522434 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.535234 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bhhh"] Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.560257 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-utilities\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.560432 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrssf\" (UniqueName: \"kubernetes.io/projected/0b4a869e-2e74-4226-b41c-c8a481a0728b-kube-api-access-qrssf\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.560484 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-catalog-content\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.662206 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrssf\" (UniqueName: \"kubernetes.io/projected/0b4a869e-2e74-4226-b41c-c8a481a0728b-kube-api-access-qrssf\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.662329 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-catalog-content\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.662421 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-utilities\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 
00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.663491 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-catalog-content\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.663555 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b4a869e-2e74-4226-b41c-c8a481a0728b-utilities\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.688029 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrssf\" (UniqueName: \"kubernetes.io/projected/0b4a869e-2e74-4226-b41c-c8a481a0728b-kube-api-access-qrssf\") pod \"certified-operators-7bhhh\" (UID: \"0b4a869e-2e74-4226-b41c-c8a481a0728b\") " pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.834202 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" event={"ID":"fb20bbdd-0a13-4d07-8c4a-8c2285de3173","Type":"ContainerStarted","Data":"64877aa3b7149ca58578844502bee7101605f38b4348d45d430ea5251007cc20"} Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.851813 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:21 crc kubenswrapper[4774]: I0127 00:13:21.858944 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bcz6w" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.112323 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bhhh"] Jan 27 00:13:22 crc kubenswrapper[4774]: W0127 00:13:22.131382 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b4a869e_2e74_4226_b41c_c8a481a0728b.slice/crio-e02d47eb75a33b43615a15af0012d068b8d01d26136c5f7907e6445560875e8b WatchSource:0}: Error finding container e02d47eb75a33b43615a15af0012d068b8d01d26136c5f7907e6445560875e8b: Status 404 returned error can't find the container with id e02d47eb75a33b43615a15af0012d068b8d01d26136c5f7907e6445560875e8b Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.363587 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27139ba0-dc60-4e5c-aff2-dc33b8f1ff72" path="/var/lib/kubelet/pods/27139ba0-dc60-4e5c-aff2-dc33b8f1ff72/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.364534 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386f196b-c4bc-4fea-924b-c0487a352310" path="/var/lib/kubelet/pods/386f196b-c4bc-4fea-924b-c0487a352310/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.365169 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="687ff3b2-f773-482b-9bf7-1b6135b4d6ac" path="/var/lib/kubelet/pods/687ff3b2-f773-482b-9bf7-1b6135b4d6ac/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.366386 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cad24dd-acc4-40e5-8380-e0d74be79921" 
path="/var/lib/kubelet/pods/6cad24dd-acc4-40e5-8380-e0d74be79921/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.367085 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af002227-deb6-4a24-8ce5-051f93bc178b" path="/var/lib/kubelet/pods/af002227-deb6-4a24-8ce5-051f93bc178b/volumes" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.887612 4774 generic.go:334] "Generic (PLEG): container finished" podID="0b4a869e-2e74-4226-b41c-c8a481a0728b" containerID="8b8aeff7b3ebdbaf9b163700840cc3ddd2df567ed32b3e95708874387def64dc" exitCode=0 Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.887687 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerDied","Data":"8b8aeff7b3ebdbaf9b163700840cc3ddd2df567ed32b3e95708874387def64dc"} Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.887774 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerStarted","Data":"e02d47eb75a33b43615a15af0012d068b8d01d26136c5f7907e6445560875e8b"} Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.921876 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.923132 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.929586 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.938465 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.985998 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.986089 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:22 crc kubenswrapper[4774]: I0127 00:13:22.986463 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.087790 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " 
pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.087900 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.087946 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.088429 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.088472 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.114501 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") pod \"redhat-marketplace-gb9l9\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.248484 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.693214 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.897369 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerStarted","Data":"2c6dd8878a47022f56e27b5201cf3e35ff53f86100131049fb6f5a97d01e4b04"} Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.900136 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerStarted","Data":"51eaded7e7d2f079a111fb785033a43f6ec05414382be88a1e655bf429602842"} Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.900211 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerStarted","Data":"bc02ff041620d6abe4f9549d299ee2ebf620ffad6df25e35936cc68f50fc4bcd"} Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.931195 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tg7j5"] Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.932503 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.934536 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 00:13:23 crc kubenswrapper[4774]: I0127 00:13:23.948149 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tg7j5"] Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.004827 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-utilities\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.012422 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcm9\" (UniqueName: \"kubernetes.io/projected/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-kube-api-access-sfcm9\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.012608 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-catalog-content\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.113506 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-utilities\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 
00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.113578 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcm9\" (UniqueName: \"kubernetes.io/projected/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-kube-api-access-sfcm9\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.113656 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-catalog-content\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.114256 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-catalog-content\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.114416 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-utilities\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.134124 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcm9\" (UniqueName: \"kubernetes.io/projected/8be8b5b9-e3bc-4236-90ca-3d3808fa39b4-kube-api-access-sfcm9\") pod \"redhat-operators-tg7j5\" (UID: \"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4\") " pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.290660 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.531234 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tg7j5"] Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.911736 4774 generic.go:334] "Generic (PLEG): container finished" podID="0b4a869e-2e74-4226-b41c-c8a481a0728b" containerID="2c6dd8878a47022f56e27b5201cf3e35ff53f86100131049fb6f5a97d01e4b04" exitCode=0 Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.911933 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerDied","Data":"2c6dd8878a47022f56e27b5201cf3e35ff53f86100131049fb6f5a97d01e4b04"} Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.915517 4774 generic.go:334] "Generic (PLEG): container finished" podID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerID="51eaded7e7d2f079a111fb785033a43f6ec05414382be88a1e655bf429602842" exitCode=0 Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.915723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerDied","Data":"51eaded7e7d2f079a111fb785033a43f6ec05414382be88a1e655bf429602842"} Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.917455 4774 generic.go:334] "Generic (PLEG): container finished" podID="8be8b5b9-e3bc-4236-90ca-3d3808fa39b4" containerID="70e5425f9e3c516bf5c6c0404244081e7c1f4b91fe2e253905d74ac101fc43ee" exitCode=0 Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.917514 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerDied","Data":"70e5425f9e3c516bf5c6c0404244081e7c1f4b91fe2e253905d74ac101fc43ee"} Jan 27 00:13:24 crc kubenswrapper[4774]: I0127 00:13:24.917560 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerStarted","Data":"124200faa1093ffc5f82f012ae8756c1b9cb9df7d6a711ebf34b6c3c8643dfed"} Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.336538 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6krcw"] Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.337698 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.340803 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.345554 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6krcw"] Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.463527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-catalog-content\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.463981 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf96c\" (UniqueName: \"kubernetes.io/projected/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-kube-api-access-kf96c\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.464073 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-utilities\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.565957 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-utilities\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.566067 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-catalog-content\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.566118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf96c\" (UniqueName: \"kubernetes.io/projected/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-kube-api-access-kf96c\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.566560 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-catalog-content\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.566592 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-utilities\") pod \"community-operators-6krcw\" (UID: 
\"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.585803 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf96c\" (UniqueName: \"kubernetes.io/projected/89dc2f6e-3f9e-4098-b5d4-ff9481de0824-kube-api-access-kf96c\") pod \"community-operators-6krcw\" (UID: \"89dc2f6e-3f9e-4098-b5d4-ff9481de0824\") " pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.669332 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.926150 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bhhh" event={"ID":"0b4a869e-2e74-4226-b41c-c8a481a0728b","Type":"ContainerStarted","Data":"345b5c0316ebee074c748f3ef5140c0f4c008f93c214eaca618daf79b3106144"} Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.929640 4774 generic.go:334] "Generic (PLEG): container finished" podID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerID="20d5f60ef29061380fe8995d910d907f50edc601d69e30e44c129e980a103c38" exitCode=0 Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.929705 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerDied","Data":"20d5f60ef29061380fe8995d910d907f50edc601d69e30e44c129e980a103c38"} Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.932302 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerStarted","Data":"33b37e04507546c38a3ecd1d6b7da9fa51f5f561ac6704c2db49282fbeb29bfe"} Jan 27 00:13:25 crc kubenswrapper[4774]: I0127 00:13:25.943427 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7bhhh" podStartSLOduration=2.472101325 podStartE2EDuration="4.943407194s" podCreationTimestamp="2026-01-27 00:13:21 +0000 UTC" firstStartedPulling="2026-01-27 00:13:22.88987437 +0000 UTC m=+381.195651254" lastFinishedPulling="2026-01-27 00:13:25.361180229 +0000 UTC m=+383.666957123" observedRunningTime="2026-01-27 00:13:25.942160441 +0000 UTC m=+384.247937315" watchObservedRunningTime="2026-01-27 00:13:25.943407194 +0000 UTC m=+384.249184068" Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.137122 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6krcw"] Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.943195 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerStarted","Data":"ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f"} Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.946548 4774 generic.go:334] "Generic (PLEG): container finished" podID="8be8b5b9-e3bc-4236-90ca-3d3808fa39b4" containerID="33b37e04507546c38a3ecd1d6b7da9fa51f5f561ac6704c2db49282fbeb29bfe" exitCode=0 Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.946622 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" 
event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerDied","Data":"33b37e04507546c38a3ecd1d6b7da9fa51f5f561ac6704c2db49282fbeb29bfe"} Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.954700 4774 generic.go:334] "Generic (PLEG): container finished" podID="89dc2f6e-3f9e-4098-b5d4-ff9481de0824" containerID="14505b82dea48b6bf2692bc6368a4affbcd972462cba6ce0c5188462a9bf1e59" exitCode=0 Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.954786 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6krcw" event={"ID":"89dc2f6e-3f9e-4098-b5d4-ff9481de0824","Type":"ContainerDied","Data":"14505b82dea48b6bf2692bc6368a4affbcd972462cba6ce0c5188462a9bf1e59"} Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.954830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6krcw" event={"ID":"89dc2f6e-3f9e-4098-b5d4-ff9481de0824","Type":"ContainerStarted","Data":"107deb40c13230bebd0e789c080095686cbb51a5317eb7f65bc823e890ddc4f2"} Jan 27 00:13:26 crc kubenswrapper[4774]: I0127 00:13:26.972073 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gb9l9" podStartSLOduration=3.521119382 podStartE2EDuration="4.972050868s" podCreationTimestamp="2026-01-27 00:13:22 +0000 UTC" firstStartedPulling="2026-01-27 00:13:24.917911245 +0000 UTC m=+383.223688129" lastFinishedPulling="2026-01-27 00:13:26.368842731 +0000 UTC m=+384.674619615" observedRunningTime="2026-01-27 00:13:26.968070031 +0000 UTC m=+385.273846925" watchObservedRunningTime="2026-01-27 00:13:26.972050868 +0000 UTC m=+385.277827772" Jan 27 00:13:27 crc kubenswrapper[4774]: I0127 00:13:27.962446 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg7j5" event={"ID":"8be8b5b9-e3bc-4236-90ca-3d3808fa39b4","Type":"ContainerStarted","Data":"d0457685baa12dca44afe111dbf293b4a69ba1ffd872aa78e2ccee3ef0295d36"} Jan 27 00:13:27 crc kubenswrapper[4774]: I0127 00:13:27.967707 4774 generic.go:334] "Generic (PLEG): container finished" podID="89dc2f6e-3f9e-4098-b5d4-ff9481de0824" containerID="ab05f1624e10cf0666152c3a1a0f5ca887717e8fd98871d618caac5de240eaaf" exitCode=0 Jan 27 00:13:27 crc kubenswrapper[4774]: I0127 00:13:27.967823 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6krcw" event={"ID":"89dc2f6e-3f9e-4098-b5d4-ff9481de0824","Type":"ContainerDied","Data":"ab05f1624e10cf0666152c3a1a0f5ca887717e8fd98871d618caac5de240eaaf"} Jan 27 00:13:28 crc kubenswrapper[4774]: I0127 00:13:28.008754 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tg7j5" podStartSLOduration=2.558345416 podStartE2EDuration="5.008729155s" podCreationTimestamp="2026-01-27 00:13:23 +0000 UTC" firstStartedPulling="2026-01-27 00:13:24.921173342 +0000 UTC m=+383.226950226" lastFinishedPulling="2026-01-27 00:13:27.371557041 +0000 UTC m=+385.677333965" observedRunningTime="2026-01-27 00:13:27.987795396 +0000 UTC m=+386.293572280" watchObservedRunningTime="2026-01-27 00:13:28.008729155 +0000 UTC m=+386.314506039" Jan 27 00:13:28 crc kubenswrapper[4774]: I0127 00:13:28.977104 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6krcw" event={"ID":"89dc2f6e-3f9e-4098-b5d4-ff9481de0824","Type":"ContainerStarted","Data":"8aec29af7c4c2a86d9c2a6d7defd4b2644095488c482807de8e6b8d35bedcd14"} Jan 27 00:13:28 crc 
kubenswrapper[4774]: I0127 00:13:28.998041 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6krcw" podStartSLOduration=2.566818877 podStartE2EDuration="3.998021387s" podCreationTimestamp="2026-01-27 00:13:25 +0000 UTC" firstStartedPulling="2026-01-27 00:13:26.957330494 +0000 UTC m=+385.263107378" lastFinishedPulling="2026-01-27 00:13:28.388533004 +0000 UTC m=+386.694309888" observedRunningTime="2026-01-27 00:13:28.993939138 +0000 UTC m=+387.299716022" watchObservedRunningTime="2026-01-27 00:13:28.998021387 +0000 UTC m=+387.303798281" Jan 27 00:13:31 crc kubenswrapper[4774]: I0127 00:13:31.853165 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:31 crc kubenswrapper[4774]: I0127 00:13:31.854037 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:31 crc kubenswrapper[4774]: I0127 00:13:31.909297 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:32 crc kubenswrapper[4774]: I0127 00:13:32.037579 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7bhhh" Jan 27 00:13:33 crc kubenswrapper[4774]: I0127 00:13:33.249471 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:33 crc kubenswrapper[4774]: I0127 00:13:33.250084 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:33 crc kubenswrapper[4774]: I0127 00:13:33.306146 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:34 crc kubenswrapper[4774]: I0127 00:13:34.055143 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:13:34 crc kubenswrapper[4774]: I0127 00:13:34.291111 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:34 crc kubenswrapper[4774]: I0127 00:13:34.291342 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:34 crc kubenswrapper[4774]: I0127 00:13:34.341066 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:35 crc kubenswrapper[4774]: I0127 00:13:35.349642 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tg7j5" Jan 27 00:13:35 crc kubenswrapper[4774]: I0127 00:13:35.670198 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:35 crc kubenswrapper[4774]: I0127 00:13:35.670810 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:35 crc kubenswrapper[4774]: I0127 00:13:35.719067 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.346750 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6krcw" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.675446 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.675512 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.675563 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.676196 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:13:36 crc kubenswrapper[4774]: I0127 00:13:36.676258 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5" gracePeriod=600 Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.310501 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5" exitCode=0 Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.310683 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5"} Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.311580 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d"} Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.311612 4774 scope.go:117] "RemoveContainer" containerID="18112bd6234b3c5a264548f1d9b3c469554042c2c2a543a91237512eada2ae85" Jan 27 00:13:37 crc kubenswrapper[4774]: I0127 00:13:37.612478 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" podUID="df626623-28b8-43a3-a567-f14b1e95075a" containerName="registry" containerID="cri-o://3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387" gracePeriod=30 Jan 27 00:13:39 crc kubenswrapper[4774]: I0127 00:13:39.328011 4774 generic.go:334] "Generic (PLEG): container finished" podID="df626623-28b8-43a3-a567-f14b1e95075a" 
containerID="3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387" exitCode=0 Jan 27 00:13:39 crc kubenswrapper[4774]: I0127 00:13:39.328090 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" event={"ID":"df626623-28b8-43a3-a567-f14b1e95075a","Type":"ContainerDied","Data":"3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387"} Jan 27 00:13:39 crc kubenswrapper[4774]: I0127 00:13:39.945109 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060021 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060105 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060377 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060413 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060493 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060523 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060568 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.060611 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") pod \"df626623-28b8-43a3-a567-f14b1e95075a\" (UID: \"df626623-28b8-43a3-a567-f14b1e95075a\") " Jan 27 00:13:40 crc 
kubenswrapper[4774]: I0127 00:13:40.061453 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.067166 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.070444 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.070881 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.071061 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g" (OuterVolumeSpecName: "kube-api-access-sj86g") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "kube-api-access-sj86g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.082338 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.090835 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.106770 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "df626623-28b8-43a3-a567-f14b1e95075a" (UID: "df626623-28b8-43a3-a567-f14b1e95075a"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162498 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj86g\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-kube-api-access-sj86g\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162536 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162546 4774 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/df626623-28b8-43a3-a567-f14b1e95075a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162554 4774 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162562 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/df626623-28b8-43a3-a567-f14b1e95075a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162571 4774 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/df626623-28b8-43a3-a567-f14b1e95075a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.162578 4774 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/df626623-28b8-43a3-a567-f14b1e95075a-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.335545 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" event={"ID":"df626623-28b8-43a3-a567-f14b1e95075a","Type":"ContainerDied","Data":"8178e110af2bc09c51dee988ba34c5369cbe128f1c84c871cab3eebbd6fb92bf"} Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.335609 4774 scope.go:117] "RemoveContainer" containerID="3d1d5a1b2275a0bf584d707ccddcf233f3105c423230ffd6922c612599ca8387" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.335720 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zgbtz" Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.377139 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:13:40 crc kubenswrapper[4774]: I0127 00:13:40.383792 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zgbtz"] Jan 27 00:13:42 crc kubenswrapper[4774]: I0127 00:13:42.364282 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df626623-28b8-43a3-a567-f14b1e95075a" path="/var/lib/kubelet/pods/df626623-28b8-43a3-a567-f14b1e95075a/volumes" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.180343 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn"] Jan 27 00:15:00 crc kubenswrapper[4774]: E0127 00:15:00.184038 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df626623-28b8-43a3-a567-f14b1e95075a" containerName="registry" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.184274 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="df626623-28b8-43a3-a567-f14b1e95075a" containerName="registry" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.184682 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="df626623-28b8-43a3-a567-f14b1e95075a" containerName="registry" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.185748 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.188023 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.191044 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.195083 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn"] Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.294264 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.294346 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.294453 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.396056 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.396315 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.396372 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.397544 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.409139 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.414398 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") pod \"collect-profiles-29491215-64hnn\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.509044 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.718568 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn"] Jan 27 00:15:00 crc kubenswrapper[4774]: I0127 00:15:00.831826 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" event={"ID":"384f6205-7bf2-47e1-909f-def30825754d","Type":"ContainerStarted","Data":"f107b6f277f3dc55072c6f38ed6703d35cd4fd5c24142d3d9e413372e36a70a2"} Jan 27 00:15:01 crc kubenswrapper[4774]: I0127 00:15:01.839077 4774 generic.go:334] "Generic (PLEG): container finished" podID="384f6205-7bf2-47e1-909f-def30825754d" containerID="904c5d3ff81fc19a34066ba9aa7e7fa056cbdc1458b4158be4afb1cf86947ed7" exitCode=0 Jan 27 00:15:01 crc kubenswrapper[4774]: I0127 00:15:01.839140 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" event={"ID":"384f6205-7bf2-47e1-909f-def30825754d","Type":"ContainerDied","Data":"904c5d3ff81fc19a34066ba9aa7e7fa056cbdc1458b4158be4afb1cf86947ed7"} Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.058919 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.229678 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") pod \"384f6205-7bf2-47e1-909f-def30825754d\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.230979 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") pod \"384f6205-7bf2-47e1-909f-def30825754d\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.231071 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") pod \"384f6205-7bf2-47e1-909f-def30825754d\" (UID: \"384f6205-7bf2-47e1-909f-def30825754d\") " Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.231546 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume" (OuterVolumeSpecName: "config-volume") pod "384f6205-7bf2-47e1-909f-def30825754d" (UID: "384f6205-7bf2-47e1-909f-def30825754d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.231772 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/384f6205-7bf2-47e1-909f-def30825754d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.236194 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv" (OuterVolumeSpecName: "kube-api-access-nsgxv") pod "384f6205-7bf2-47e1-909f-def30825754d" (UID: "384f6205-7bf2-47e1-909f-def30825754d"). InnerVolumeSpecName "kube-api-access-nsgxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.237063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "384f6205-7bf2-47e1-909f-def30825754d" (UID: "384f6205-7bf2-47e1-909f-def30825754d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.332294 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/384f6205-7bf2-47e1-909f-def30825754d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.332324 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsgxv\" (UniqueName: \"kubernetes.io/projected/384f6205-7bf2-47e1-909f-def30825754d-kube-api-access-nsgxv\") on node \"crc\" DevicePath \"\"" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.850599 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" event={"ID":"384f6205-7bf2-47e1-909f-def30825754d","Type":"ContainerDied","Data":"f107b6f277f3dc55072c6f38ed6703d35cd4fd5c24142d3d9e413372e36a70a2"} Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.850896 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f107b6f277f3dc55072c6f38ed6703d35cd4fd5c24142d3d9e413372e36a70a2" Jan 27 00:15:03 crc kubenswrapper[4774]: I0127 00:15:03.850924 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491215-64hnn" Jan 27 00:16:06 crc kubenswrapper[4774]: I0127 00:16:06.674978 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:16:06 crc kubenswrapper[4774]: I0127 00:16:06.675600 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:16:36 crc kubenswrapper[4774]: I0127 00:16:36.675843 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:16:36 crc kubenswrapper[4774]: I0127 00:16:36.676461 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.675697 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.676225 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.676269 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.676794 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:17:06 crc kubenswrapper[4774]: I0127 00:17:06.676848 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d" gracePeriod=600 Jan 27 00:17:07 crc kubenswrapper[4774]: I0127 00:17:07.617027 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" 
containerID="1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d" exitCode=0 Jan 27 00:17:07 crc kubenswrapper[4774]: I0127 00:17:07.617169 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d"} Jan 27 00:17:07 crc kubenswrapper[4774]: I0127 00:17:07.617465 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442"} Jan 27 00:17:07 crc kubenswrapper[4774]: I0127 00:17:07.617495 4774 scope.go:117] "RemoveContainer" containerID="447c321d916e303176c9279a924cd06866f1f990692f85480dc8efad70b988f5" Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.874546 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5rgv"] Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.875792 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-controller" containerID="cri-o://21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.875921 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="nbdb" containerID="cri-o://33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876070 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="northd" containerID="cri-o://4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876169 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876257 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-acl-logging" containerID="cri-o://be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876340 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="sbdb" containerID="cri-o://a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.876318 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" 
containerName="kube-rbac-proxy-node" containerID="cri-o://c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" gracePeriod=30 Jan 27 00:18:20 crc kubenswrapper[4774]: I0127 00:18:20.937055 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" containerID="cri-o://86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" gracePeriod=30 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.086176 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovnkube-controller/3.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.088780 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-acl-logging/0.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.089436 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-controller/0.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.089956 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" exitCode=0 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.089990 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" exitCode=0 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.089997 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" exitCode=0 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090018 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" exitCode=143 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090026 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" exitCode=143 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090037 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090102 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090118 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090130 4774 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090144 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.090180 4774 scope.go:117] "RemoveContainer" containerID="549d7d6cc7feaea08014a551f33f132a2b4c3737125b473d4804e7e8af373f89" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.092759 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/2.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.093439 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/1.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.093489 4774 generic.go:334] "Generic (PLEG): container finished" podID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" containerID="b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c" exitCode=2 Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.093525 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerDied","Data":"b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c"} Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.094045 4774 scope.go:117] "RemoveContainer" containerID="b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.094327 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mtz9l_openshift-multus(0abcf78e-9b05-4b89-94f3-4d3230886ce0)\"" pod="openshift-multus/multus-mtz9l" podUID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.148725 4774 scope.go:117] "RemoveContainer" containerID="01f2ab3254eeace7c038c8f9cb279bb5cac1de95aba311b46c5215b3aab59972" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.242616 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-acl-logging/0.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.243147 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-controller/0.log" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.243902 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309054 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hxmlk"] Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309557 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-node" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309582 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-node" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309602 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309632 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309646 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="nbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309654 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="nbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309664 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309672 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309753 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="northd" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309764 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="northd" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309806 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309816 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309828 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kubecfg-setup" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309836 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kubecfg-setup" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309892 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-acl-logging" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309901 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-acl-logging" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309915 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="sbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309923 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="sbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309935 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384f6205-7bf2-47e1-909f-def30825754d" containerName="collect-profiles" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309943 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="384f6205-7bf2-47e1-909f-def30825754d" containerName="collect-profiles" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.309984 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.309992 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.310006 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310015 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310188 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="northd" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310226 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310241 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310251 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310262 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310292 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310306 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="sbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310315 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovn-acl-logging" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310327 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="nbdb" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310340 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 
00:18:21.310389 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="kube-rbac-proxy-node" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310402 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="384f6205-7bf2-47e1-909f-def30825754d" containerName="collect-profiles" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.310568 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310578 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310796 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: E0127 00:18:21.310986 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.310997 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerName="ovnkube-controller" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.313714 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420031 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420114 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420140 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420177 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420205 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420243 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420275 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420300 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420323 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420348 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420379 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420424 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420449 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420477 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420533 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420564 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420589 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420625 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420644 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") pod \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\" (UID: \"db881c9d-a960-48ae-93bf-d0ccd687e0b9\") " Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420888 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-etc-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420924 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-bin\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420950 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.420976 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-netd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421000 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba74c484-716d-469a-9ac1-299eb234026a-ovn-node-metrics-cert\") pod \"ovnkube-node-hxmlk\" 
(UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421030 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-script-lib\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421054 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-log-socket\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421077 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-netns\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421098 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-slash\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421128 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-systemd-units\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421149 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-var-lib-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421171 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-ovn\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421194 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-kubelet\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421222 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-env-overrides\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421249 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smbvh\" (UniqueName: \"kubernetes.io/projected/ba74c484-716d-469a-9ac1-299eb234026a-kube-api-access-smbvh\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421287 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-node-log\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421320 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421327 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421350 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421453 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash" (OuterVolumeSpecName: "host-slash") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421472 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-systemd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-config\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421657 4774 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.421677 4774 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-slash\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422767 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422829 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422896 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422925 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log" (OuterVolumeSpecName: "node-log") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422918 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.422996 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423030 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423672 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket" (OuterVolumeSpecName: "log-socket") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423773 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.423807 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.425293 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.425346 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.425630 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.429612 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km" (OuterVolumeSpecName: "kube-api-access-l92km") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "kube-api-access-l92km". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.431334 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.447823 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "db881c9d-a960-48ae-93bf-d0ccd687e0b9" (UID: "db881c9d-a960-48ae-93bf-d0ccd687e0b9"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523608 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-node-log\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523721 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523759 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-node-log\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523803 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523850 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.523897 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-systemd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524026 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-config\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524052 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-systemd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524073 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-ovn-kubernetes\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524204 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-etc-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524101 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-etc-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524348 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-bin\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524373 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524396 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-netd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524417 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba74c484-716d-469a-9ac1-299eb234026a-ovn-node-metrics-cert\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524480 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-script-lib\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524502 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-log-socket\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524525 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-netns\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: 
I0127 00:18:21.524533 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-bin\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524565 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-slash\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524567 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-cni-netd\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524546 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-slash\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524638 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-run-netns\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524687 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-log-socket\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524787 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-systemd-units\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524838 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-var-lib-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524920 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-ovn\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.524998 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-kubelet\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525043 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-config\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525057 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-env-overrides\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525128 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525197 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-run-ovn\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525513 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-systemd-units\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525557 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-host-kubelet\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525581 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ba74c484-716d-469a-9ac1-299eb234026a-var-lib-openvswitch\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525725 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smbvh\" (UniqueName: \"kubernetes.io/projected/ba74c484-716d-469a-9ac1-299eb234026a-kube-api-access-smbvh\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.525788 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-ovnkube-script-lib\") pod \"ovnkube-node-hxmlk\" (UID: 
\"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526072 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526108 4774 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526127 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526145 4774 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526162 4774 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526181 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526199 4774 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526216 4774 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-log-socket\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526233 4774 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526250 4774 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526267 4774 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526283 4774 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-node-log\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526300 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db881c9d-a960-48ae-93bf-d0ccd687e0b9-ovnkube-config\") on node \"crc\" 
DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526319 4774 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526323 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ba74c484-716d-469a-9ac1-299eb234026a-env-overrides\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526335 4774 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526414 4774 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526437 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l92km\" (UniqueName: \"kubernetes.io/projected/db881c9d-a960-48ae-93bf-d0ccd687e0b9-kube-api-access-l92km\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.526456 4774 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db881c9d-a960-48ae-93bf-d0ccd687e0b9-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.528475 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba74c484-716d-469a-9ac1-299eb234026a-ovn-node-metrics-cert\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.558321 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smbvh\" (UniqueName: \"kubernetes.io/projected/ba74c484-716d-469a-9ac1-299eb234026a-kube-api-access-smbvh\") pod \"ovnkube-node-hxmlk\" (UID: \"ba74c484-716d-469a-9ac1-299eb234026a\") " pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:21 crc kubenswrapper[4774]: I0127 00:18:21.631153 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.104196 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba74c484-716d-469a-9ac1-299eb234026a" containerID="c78f160701dec7e9fe4b1c576bee0236f75bffca60ad65c0838e3abae4adc208" exitCode=0 Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.104283 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerDied","Data":"c78f160701dec7e9fe4b1c576bee0236f75bffca60ad65c0838e3abae4adc208"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.104387 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"4d24f470b2834422d5b2e205a7ac192b100ed1a914ccb1a11e897bdb71b7b1f8"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.110710 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-acl-logging/0.log" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.112498 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l5rgv_db881c9d-a960-48ae-93bf-d0ccd687e0b9/ovn-controller/0.log" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113615 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" exitCode=0 Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113649 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" exitCode=0 Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113667 4774 generic.go:334] "Generic (PLEG): container finished" podID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" exitCode=0 Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113750 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113772 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113937 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.114143 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.114240 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5rgv" event={"ID":"db881c9d-a960-48ae-93bf-d0ccd687e0b9","Type":"ContainerDied","Data":"0b00b9fd4feb292e024108c60b4d42083f61c51695221c829f872f112bca3d83"} Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.113982 4774 scope.go:117] "RemoveContainer" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.123836 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/2.log" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.147062 4774 scope.go:117] "RemoveContainer" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.177698 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5rgv"] Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.185215 4774 scope.go:117] "RemoveContainer" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.200290 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5rgv"] Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.236044 4774 scope.go:117] "RemoveContainer" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.258934 4774 scope.go:117] "RemoveContainer" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.274473 4774 scope.go:117] "RemoveContainer" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.291224 4774 scope.go:117] "RemoveContainer" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.312452 4774 scope.go:117] "RemoveContainer" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.329352 4774 scope.go:117] "RemoveContainer" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 
00:18:22.354327 4774 scope.go:117] "RemoveContainer" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.354944 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": container with ID starting with 86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c not found: ID does not exist" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.355007 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} err="failed to get container status \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": rpc error: code = NotFound desc = could not find container \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": container with ID starting with 86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.355052 4774 scope.go:117] "RemoveContainer" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.356486 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": container with ID starting with a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886 not found: ID does not exist" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.356535 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} err="failed to get container status \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": rpc error: code = NotFound desc = could not find container \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": container with ID starting with a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.356571 4774 scope.go:117] "RemoveContainer" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.356939 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": container with ID starting with 33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907 not found: ID does not exist" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.356958 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} err="failed to get container status \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": rpc error: code = NotFound desc = could not find container \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": container with ID 
starting with 33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357029 4774 scope.go:117] "RemoveContainer" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.357372 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": container with ID starting with 4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62 not found: ID does not exist" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357397 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} err="failed to get container status \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": rpc error: code = NotFound desc = could not find container \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": container with ID starting with 4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357411 4774 scope.go:117] "RemoveContainer" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.357701 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": container with ID starting with 18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f not found: ID does not exist" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357751 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} err="failed to get container status \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": rpc error: code = NotFound desc = could not find container \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": container with ID starting with 18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.357780 4774 scope.go:117] "RemoveContainer" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.358076 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": container with ID starting with c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d not found: ID does not exist" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358100 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} err="failed to get container status 
\"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": rpc error: code = NotFound desc = could not find container \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": container with ID starting with c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358113 4774 scope.go:117] "RemoveContainer" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.358367 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": container with ID starting with be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4 not found: ID does not exist" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358394 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} err="failed to get container status \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": rpc error: code = NotFound desc = could not find container \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": container with ID starting with be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358407 4774 scope.go:117] "RemoveContainer" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.358682 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": container with ID starting with 21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37 not found: ID does not exist" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358706 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} err="failed to get container status \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": rpc error: code = NotFound desc = could not find container \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": container with ID starting with 21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.358720 4774 scope.go:117] "RemoveContainer" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: E0127 00:18:22.359052 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": container with ID starting with b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135 not found: ID does not exist" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.359077 4774 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135"} err="failed to get container status \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": rpc error: code = NotFound desc = could not find container \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": container with ID starting with b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.359093 4774 scope.go:117] "RemoveContainer" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.359448 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} err="failed to get container status \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": rpc error: code = NotFound desc = could not find container \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": container with ID starting with 86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.359550 4774 scope.go:117] "RemoveContainer" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.360435 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} err="failed to get container status \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": rpc error: code = NotFound desc = could not find container \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": container with ID starting with a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.360459 4774 scope.go:117] "RemoveContainer" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.360824 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} err="failed to get container status \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": rpc error: code = NotFound desc = could not find container \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": container with ID starting with 33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.360839 4774 scope.go:117] "RemoveContainer" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.361184 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} err="failed to get container status \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": rpc error: code = NotFound desc = could not find container \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": container with ID starting with 4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62 
not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.361216 4774 scope.go:117] "RemoveContainer" containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.361684 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} err="failed to get container status \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": rpc error: code = NotFound desc = could not find container \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": container with ID starting with 18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.361726 4774 scope.go:117] "RemoveContainer" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.362418 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} err="failed to get container status \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": rpc error: code = NotFound desc = could not find container \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": container with ID starting with c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.362443 4774 scope.go:117] "RemoveContainer" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.362774 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} err="failed to get container status \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": rpc error: code = NotFound desc = could not find container \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": container with ID starting with be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.362792 4774 scope.go:117] "RemoveContainer" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.363066 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} err="failed to get container status \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": rpc error: code = NotFound desc = could not find container \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": container with ID starting with 21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.363091 4774 scope.go:117] "RemoveContainer" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364147 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135"} err="failed to get 
container status \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": rpc error: code = NotFound desc = could not find container \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": container with ID starting with b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364184 4774 scope.go:117] "RemoveContainer" containerID="86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364572 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c"} err="failed to get container status \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": rpc error: code = NotFound desc = could not find container \"86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c\": container with ID starting with 86cb6803a2cd50e12129c9ea6027aa0e160b73eda2c8d3b21d7c616346336c0c not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364599 4774 scope.go:117] "RemoveContainer" containerID="a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364948 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886"} err="failed to get container status \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": rpc error: code = NotFound desc = could not find container \"a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886\": container with ID starting with a9cc58a5ff0f087c19882a38df616aedf4051dbfdc9bf89dc3d9a5b3102a0886 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.364997 4774 scope.go:117] "RemoveContainer" containerID="33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.365279 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907"} err="failed to get container status \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": rpc error: code = NotFound desc = could not find container \"33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907\": container with ID starting with 33ed5ceaf36651afa0019a7575a8809277b2f2779fff9cfedb93e2bf20a3a907 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.365313 4774 scope.go:117] "RemoveContainer" containerID="4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366027 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62"} err="failed to get container status \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": rpc error: code = NotFound desc = could not find container \"4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62\": container with ID starting with 4bee5e1f9ac808425c5e591840f3053edb96dbe9c1ae6ff7666a28ea8ce08d62 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366047 4774 scope.go:117] "RemoveContainer" 
containerID="18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366321 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f"} err="failed to get container status \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": rpc error: code = NotFound desc = could not find container \"18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f\": container with ID starting with 18655ad9f2163be1f840baa9a71e1e1aa5147ae8499a66f1e92391853e21c77f not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366360 4774 scope.go:117] "RemoveContainer" containerID="c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366663 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d"} err="failed to get container status \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": rpc error: code = NotFound desc = could not find container \"c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d\": container with ID starting with c5e19526bc1aa5725440559f47cd3d5579b5f11730db43c7643ee13392c2fe8d not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366692 4774 scope.go:117] "RemoveContainer" containerID="be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.366967 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db881c9d-a960-48ae-93bf-d0ccd687e0b9" path="/var/lib/kubelet/pods/db881c9d-a960-48ae-93bf-d0ccd687e0b9/volumes" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.367011 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4"} err="failed to get container status \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": rpc error: code = NotFound desc = could not find container \"be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4\": container with ID starting with be7775cf09f98a0bc191e4a04f0ab4ccbb47f644032cf77c688c94f3f047e3f4 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.367190 4774 scope.go:117] "RemoveContainer" containerID="21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.368886 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37"} err="failed to get container status \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": rpc error: code = NotFound desc = could not find container \"21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37\": container with ID starting with 21c0407beec9f2720afbfca7cfdc3ef4373c6d89ae7b21000c94f49af707dc37 not found: ID does not exist" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.368917 4774 scope.go:117] "RemoveContainer" containerID="b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135" Jan 27 00:18:22 crc kubenswrapper[4774]: I0127 00:18:22.369337 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135"} err="failed to get container status \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": rpc error: code = NotFound desc = could not find container \"b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135\": container with ID starting with b1699424ac9ec216f1bf6c58fd692f1221884161c147d3cb6e8ec9cd67794135 not found: ID does not exist" Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133539 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"fe3b93208880432c7317d8b671d58aeab8649a43fa9b0515bd3c226984526703"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133932 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"755bee5634b393794a6e1807590e0b71ca56a97a4b8e1415956c704a7eca9105"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"bb72bf099ff429daba307aed2238a54bcabd3965341914ad0a47f9a484960668"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133961 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"a54be5b92d1355b2973911ba966b4b9d7385eba1caeb7cf0b62ac06a04f034e9"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133973 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"e745792bcc06dc6a565c7d5cf4c37c30d3e00d55170e55c0e03c8c151fb35750"} Jan 27 00:18:23 crc kubenswrapper[4774]: I0127 00:18:23.133990 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"0afd07e16c7708dcd690cb6d4fefe712540b9e15c6e7d5c11aeba439b5c047c3"} Jan 27 00:18:25 crc kubenswrapper[4774]: I0127 00:18:25.157495 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"a6873f3caf97359917bea9f3d3de8ea2283ad867a4c5bae70d19dd80f7d94cfb"} Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.187385 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" event={"ID":"ba74c484-716d-469a-9ac1-299eb234026a","Type":"ContainerStarted","Data":"a9d31913a209a867409b2d0a7680feb6dac5c275bcb22e87f08ec0743f366b48"} Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.187897 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.187919 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.187931 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.224659 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" podStartSLOduration=7.224640103 podStartE2EDuration="7.224640103s" podCreationTimestamp="2026-01-27 00:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:18:28.220728829 +0000 UTC m=+686.526505723" watchObservedRunningTime="2026-01-27 00:18:28.224640103 +0000 UTC m=+686.530416987" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.224980 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:28 crc kubenswrapper[4774]: I0127 00:18:28.228452 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:18:32 crc kubenswrapper[4774]: I0127 00:18:32.364952 4774 scope.go:117] "RemoveContainer" containerID="b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c" Jan 27 00:18:32 crc kubenswrapper[4774]: E0127 00:18:32.366073 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mtz9l_openshift-multus(0abcf78e-9b05-4b89-94f3-4d3230886ce0)\"" pod="openshift-multus/multus-mtz9l" podUID="0abcf78e-9b05-4b89-94f3-4d3230886ce0" Jan 27 00:18:44 crc kubenswrapper[4774]: I0127 00:18:44.358005 4774 scope.go:117] "RemoveContainer" containerID="b80af4d88f8c0edcc1099c3d9d22e61df1448e662e454061bfd5aab1317d804c" Jan 27 00:18:45 crc kubenswrapper[4774]: I0127 00:18:45.315155 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mtz9l_0abcf78e-9b05-4b89-94f3-4d3230886ce0/kube-multus/2.log" Jan 27 00:18:45 crc kubenswrapper[4774]: I0127 00:18:45.315706 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mtz9l" event={"ID":"0abcf78e-9b05-4b89-94f3-4d3230886ce0","Type":"ContainerStarted","Data":"958f1ed8b4448031ab68fded275acfa4629d5770bab597ec1754a1cd5bd04722"} Jan 27 00:18:51 crc kubenswrapper[4774]: I0127 00:18:51.665730 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hxmlk" Jan 27 00:19:06 crc kubenswrapper[4774]: I0127 00:19:06.675751 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:19:06 crc kubenswrapper[4774]: I0127 00:19:06.677140 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.391907 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.394811 4774 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-gb9l9" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="registry-server" containerID="cri-o://ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f" gracePeriod=30 Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.648195 4774 generic.go:334] "Generic (PLEG): container finished" podID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerID="ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f" exitCode=0 Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.648264 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerDied","Data":"ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f"} Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.805921 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.977463 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") pod \"36542624-2b85-40c0-a571-9ded4d2dcb9b\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.977572 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") pod \"36542624-2b85-40c0-a571-9ded4d2dcb9b\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.977640 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") pod \"36542624-2b85-40c0-a571-9ded4d2dcb9b\" (UID: \"36542624-2b85-40c0-a571-9ded4d2dcb9b\") " Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.978981 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities" (OuterVolumeSpecName: "utilities") pod "36542624-2b85-40c0-a571-9ded4d2dcb9b" (UID: "36542624-2b85-40c0-a571-9ded4d2dcb9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:30 crc kubenswrapper[4774]: I0127 00:19:30.993771 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw" (OuterVolumeSpecName: "kube-api-access-wzbhw") pod "36542624-2b85-40c0-a571-9ded4d2dcb9b" (UID: "36542624-2b85-40c0-a571-9ded4d2dcb9b"). InnerVolumeSpecName "kube-api-access-wzbhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.001939 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36542624-2b85-40c0-a571-9ded4d2dcb9b" (UID: "36542624-2b85-40c0-a571-9ded4d2dcb9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.079976 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.080313 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzbhw\" (UniqueName: \"kubernetes.io/projected/36542624-2b85-40c0-a571-9ded4d2dcb9b-kube-api-access-wzbhw\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.080383 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36542624-2b85-40c0-a571-9ded4d2dcb9b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.659273 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb9l9" event={"ID":"36542624-2b85-40c0-a571-9ded4d2dcb9b","Type":"ContainerDied","Data":"bc02ff041620d6abe4f9549d299ee2ebf620ffad6df25e35936cc68f50fc4bcd"} Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.659780 4774 scope.go:117] "RemoveContainer" containerID="ab6b4ff7811ed8cdc4187956ce47abd65643238fa2c4fc2041db620b531de40f" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.659377 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb9l9" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.705653 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.708398 4774 scope.go:117] "RemoveContainer" containerID="20d5f60ef29061380fe8995d910d907f50edc601d69e30e44c129e980a103c38" Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.730131 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb9l9"] Jan 27 00:19:31 crc kubenswrapper[4774]: I0127 00:19:31.766763 4774 scope.go:117] "RemoveContainer" containerID="51eaded7e7d2f079a111fb785033a43f6ec05414382be88a1e655bf429602842" Jan 27 00:19:32 crc kubenswrapper[4774]: I0127 00:19:32.369758 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" path="/var/lib/kubelet/pods/36542624-2b85-40c0-a571-9ded4d2dcb9b/volumes" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.526821 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2"] Jan 27 00:19:34 crc kubenswrapper[4774]: E0127 00:19:34.527476 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="registry-server" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.527492 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="registry-server" Jan 27 00:19:34 crc kubenswrapper[4774]: E0127 00:19:34.527518 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="extract-utilities" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.527527 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="extract-utilities" Jan 27 00:19:34 crc 
kubenswrapper[4774]: E0127 00:19:34.527543 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="extract-content" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.527551 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="extract-content" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.527670 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="36542624-2b85-40c0-a571-9ded4d2dcb9b" containerName="registry-server" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.528600 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.531025 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.540215 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2"] Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.557280 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.557849 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.557947 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.659463 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.659517 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.659539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.660458 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.660571 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.679989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:34 crc kubenswrapper[4774]: I0127 00:19:34.851058 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:35 crc kubenswrapper[4774]: I0127 00:19:35.266773 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2"] Jan 27 00:19:35 crc kubenswrapper[4774]: I0127 00:19:35.684100 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerStarted","Data":"d8bfeab9be4b695cf6031430494c06a374a7a674a2bc6e25bc4e6797521a722b"} Jan 27 00:19:35 crc kubenswrapper[4774]: I0127 00:19:35.684454 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerStarted","Data":"97e1c6bd5cd72e557d94c68a32981b66a99aed46052345d6d84fff6b3f9a03d8"} Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.675582 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.675676 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.698263 4774 generic.go:334] "Generic (PLEG): container finished" podID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerID="d8bfeab9be4b695cf6031430494c06a374a7a674a2bc6e25bc4e6797521a722b" exitCode=0 Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.698331 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerDied","Data":"d8bfeab9be4b695cf6031430494c06a374a7a674a2bc6e25bc4e6797521a722b"} Jan 27 00:19:36 crc kubenswrapper[4774]: I0127 00:19:36.701090 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:19:38 crc kubenswrapper[4774]: I0127 00:19:38.716122 4774 generic.go:334] "Generic (PLEG): container finished" podID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerID="596d0d773d00494f782d77dbf041bb0e68fdbd48057eb27f130de14f4558ec47" exitCode=0 Jan 27 00:19:38 crc kubenswrapper[4774]: I0127 00:19:38.716281 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerDied","Data":"596d0d773d00494f782d77dbf041bb0e68fdbd48057eb27f130de14f4558ec47"} Jan 27 00:19:39 crc kubenswrapper[4774]: I0127 00:19:39.725504 4774 generic.go:334] "Generic (PLEG): container finished" podID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerID="5e13ab57abb6cae526ec9bcd9a02622712c7d2f7fd61e820ecb36b6e98e5d7b1" exitCode=0 Jan 27 00:19:39 crc kubenswrapper[4774]: I0127 00:19:39.725592 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerDied","Data":"5e13ab57abb6cae526ec9bcd9a02622712c7d2f7fd61e820ecb36b6e98e5d7b1"} Jan 27 00:19:40 crc kubenswrapper[4774]: I0127 00:19:40.960475 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.059199 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") pod \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.059258 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") pod \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.059340 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") pod \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\" (UID: \"f2086af6-4ed5-4f66-bf01-aa661ba5a168\") " Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.061787 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle" (OuterVolumeSpecName: "bundle") pod "f2086af6-4ed5-4f66-bf01-aa661ba5a168" (UID: "f2086af6-4ed5-4f66-bf01-aa661ba5a168"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.068112 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj" (OuterVolumeSpecName: "kube-api-access-g27tj") pod "f2086af6-4ed5-4f66-bf01-aa661ba5a168" (UID: "f2086af6-4ed5-4f66-bf01-aa661ba5a168"). InnerVolumeSpecName "kube-api-access-g27tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.071477 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util" (OuterVolumeSpecName: "util") pod "f2086af6-4ed5-4f66-bf01-aa661ba5a168" (UID: "f2086af6-4ed5-4f66-bf01-aa661ba5a168"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.160877 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.160917 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g27tj\" (UniqueName: \"kubernetes.io/projected/f2086af6-4ed5-4f66-bf01-aa661ba5a168-kube-api-access-g27tj\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.160929 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2086af6-4ed5-4f66-bf01-aa661ba5a168-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.738804 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" event={"ID":"f2086af6-4ed5-4f66-bf01-aa661ba5a168","Type":"ContainerDied","Data":"97e1c6bd5cd72e557d94c68a32981b66a99aed46052345d6d84fff6b3f9a03d8"} Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.739321 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97e1c6bd5cd72e557d94c68a32981b66a99aed46052345d6d84fff6b3f9a03d8" Jan 27 00:19:41 crc kubenswrapper[4774]: I0127 00:19:41.738949 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779381 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb"] Jan 27 00:19:42 crc kubenswrapper[4774]: E0127 00:19:42.779654 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="pull" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779671 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="pull" Jan 27 00:19:42 crc kubenswrapper[4774]: E0127 00:19:42.779688 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="extract" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779700 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="extract" Jan 27 00:19:42 crc kubenswrapper[4774]: E0127 00:19:42.779727 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="util" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779735 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="util" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.779854 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2086af6-4ed5-4f66-bf01-aa661ba5a168" containerName="extract" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.780793 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.784366 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.805756 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb"] Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.882800 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.882905 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.882964 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.890008 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.891285 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.907907 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984287 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984378 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984425 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984503 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.984618 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.985005 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:42 crc kubenswrapper[4774]: I0127 00:19:42.985099 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") pod 
\"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.012943 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.062020 4774 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.086373 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.086458 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.086486 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.087164 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.087336 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.105816 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") pod \"redhat-operators-pdxh5\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.116884 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.220548 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.382003 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb"] Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.474440 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.547591 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b"] Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.548556 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.560939 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b"] Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.694427 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.695107 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.695231 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.762534 4774 generic.go:334] "Generic (PLEG): container finished" podID="4c9c1faf-541d-4491-b37e-99909c74944b" containerID="44bf8961053363986c01a6554172bfd8b05751670ed90f16bf1e2f86b2ba58dc" exitCode=0 Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.762588 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerDied","Data":"44bf8961053363986c01a6554172bfd8b05751670ed90f16bf1e2f86b2ba58dc"} Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.762649 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerStarted","Data":"23a340b31e207dc1319a830ec2ed78f64feba461f1f191f536b89271a9b624f9"} Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.764639 4774 generic.go:334] "Generic (PLEG): 
container finished" podID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerID="ee8f24b32a8f3f27163d6f15de8e84175a69f228c191d9c0adacc759c7ce0121" exitCode=0 Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.764666 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerDied","Data":"ee8f24b32a8f3f27163d6f15de8e84175a69f228c191d9c0adacc759c7ce0121"} Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.764681 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerStarted","Data":"f686dbacce0862c41ca0b35de88c3a41d09cd938400978d49b9794435f651b3e"} Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.797137 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.797202 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.797229 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.798170 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.798271 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.830193 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " 
pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:43 crc kubenswrapper[4774]: I0127 00:19:43.861362 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:44 crc kubenswrapper[4774]: I0127 00:19:44.075598 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b"] Jan 27 00:19:44 crc kubenswrapper[4774]: W0127 00:19:44.089971 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3f5387_0b7c_4c6d_8aff_e293c038aafb.slice/crio-fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f WatchSource:0}: Error finding container fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f: Status 404 returned error can't find the container with id fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f Jan 27 00:19:44 crc kubenswrapper[4774]: I0127 00:19:44.777978 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerID="103d19fe0051a0aa6b54cfba0e223333a3f8830a2651e9586c3a424245aa9a74" exitCode=0 Jan 27 00:19:44 crc kubenswrapper[4774]: I0127 00:19:44.778048 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerDied","Data":"103d19fe0051a0aa6b54cfba0e223333a3f8830a2651e9586c3a424245aa9a74"} Jan 27 00:19:44 crc kubenswrapper[4774]: I0127 00:19:44.778089 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerStarted","Data":"fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f"} Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.784576 4774 generic.go:334] "Generic (PLEG): container finished" podID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerID="6dcbc08c278885cd2711653b6cf428e3c4856ecf3d3fb60fd92a0d6210e72762" exitCode=0 Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.784664 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerDied","Data":"6dcbc08c278885cd2711653b6cf428e3c4856ecf3d3fb60fd92a0d6210e72762"} Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.787552 4774 generic.go:334] "Generic (PLEG): container finished" podID="4c9c1faf-541d-4491-b37e-99909c74944b" containerID="92dbedab7e3a765202cdab5c278b77c503ff9fdad67f1ad130701e4b0d133e1e" exitCode=0 Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.787652 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerDied","Data":"92dbedab7e3a765202cdab5c278b77c503ff9fdad67f1ad130701e4b0d133e1e"} Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 00:19:45.791069 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerID="7cbb5b9beaea9330095805348572843bf7b2a5e4897bdf713b917fe46bc90ea9" exitCode=0 Jan 27 00:19:45 crc kubenswrapper[4774]: I0127 
00:19:45.791131 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerDied","Data":"7cbb5b9beaea9330095805348572843bf7b2a5e4897bdf713b917fe46bc90ea9"} Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.799017 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerStarted","Data":"27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3"} Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.802894 4774 generic.go:334] "Generic (PLEG): container finished" podID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerID="826946182f64aca9f23e2c531367ec09b2c0c5d61341612c15408d7ac154875d" exitCode=0 Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.803066 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerDied","Data":"826946182f64aca9f23e2c531367ec09b2c0c5d61341612c15408d7ac154875d"} Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.805665 4774 generic.go:334] "Generic (PLEG): container finished" podID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerID="ce083fb15a33bddd0ad4aad29742003d6de2d178017df06cbbf71d4c4d53f156" exitCode=0 Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.805765 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerDied","Data":"ce083fb15a33bddd0ad4aad29742003d6de2d178017df06cbbf71d4c4d53f156"} Jan 27 00:19:46 crc kubenswrapper[4774]: I0127 00:19:46.831951 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pdxh5" podStartSLOduration=2.384887124 podStartE2EDuration="4.831929866s" podCreationTimestamp="2026-01-27 00:19:42 +0000 UTC" firstStartedPulling="2026-01-27 00:19:43.76475625 +0000 UTC m=+762.070533124" lastFinishedPulling="2026-01-27 00:19:46.211798982 +0000 UTC m=+764.517575866" observedRunningTime="2026-01-27 00:19:46.829181753 +0000 UTC m=+765.134958637" watchObservedRunningTime="2026-01-27 00:19:46.831929866 +0000 UTC m=+765.137706750" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.449380 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq"] Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.450477 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.477682 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq"] Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.557734 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.557894 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.557976 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.659076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.659512 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.659630 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.659564 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.660030 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.689369 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:47 crc kubenswrapper[4774]: I0127 00:19:47.766917 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.257886 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.326017 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.370547 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") pod \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372659 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") pod \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") pod \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372732 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") pod \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372755 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") pod \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\" (UID: \"1b3f5387-0b7c-4c6d-8aff-e293c038aafb\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.372872 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") pod \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\" (UID: \"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06\") " Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.376463 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp" (OuterVolumeSpecName: "kube-api-access-gd4tp") pod "1b3f5387-0b7c-4c6d-8aff-e293c038aafb" (UID: "1b3f5387-0b7c-4c6d-8aff-e293c038aafb"). InnerVolumeSpecName "kube-api-access-gd4tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.377463 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle" (OuterVolumeSpecName: "bundle") pod "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" (UID: "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.377736 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle" (OuterVolumeSpecName: "bundle") pod "1b3f5387-0b7c-4c6d-8aff-e293c038aafb" (UID: "1b3f5387-0b7c-4c6d-8aff-e293c038aafb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.380877 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd" (OuterVolumeSpecName: "kube-api-access-5z7wd") pod "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" (UID: "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06"). InnerVolumeSpecName "kube-api-access-5z7wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.390385 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util" (OuterVolumeSpecName: "util") pod "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" (UID: "bcd9000d-51f5-47d1-9fb0-a1177a6c6b06"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.395815 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util" (OuterVolumeSpecName: "util") pod "1b3f5387-0b7c-4c6d-8aff-e293c038aafb" (UID: "1b3f5387-0b7c-4c6d-8aff-e293c038aafb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498735 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498794 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498810 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z7wd\" (UniqueName: \"kubernetes.io/projected/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-kube-api-access-5z7wd\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498823 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.498840 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bcd9000d-51f5-47d1-9fb0-a1177a6c6b06-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.503949 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd4tp\" (UniqueName: \"kubernetes.io/projected/1b3f5387-0b7c-4c6d-8aff-e293c038aafb-kube-api-access-gd4tp\") on node \"crc\" DevicePath \"\"" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.551583 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq"] Jan 27 00:19:48 crc kubenswrapper[4774]: W0127 00:19:48.586127 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bdb298d_0c46_429d_b4c2_44d106881eb7.slice/crio-0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef WatchSource:0}: Error finding container 0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef: Status 404 returned error can't find the container with id 0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592377 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592576 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="pull" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592593 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="pull" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592602 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592610 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592620 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="util" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592626 
4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="util" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592639 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="util" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592645 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="util" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592653 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="pull" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592658 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="pull" Jan 27 00:19:48 crc kubenswrapper[4774]: E0127 00:19:48.592668 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592673 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592762 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3f5387-0b7c-4c6d-8aff-e293c038aafb" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.592775 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd9000d-51f5-47d1-9fb0-a1177a6c6b06" containerName="extract" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.593523 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.624652 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.706810 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.706877 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.706898 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808096 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") pod \"certified-operators-qdczk\" (UID: 
\"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808155 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808178 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808731 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.808773 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.828025 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") pod \"certified-operators-qdczk\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.846413 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" event={"ID":"1b3f5387-0b7c-4c6d-8aff-e293c038aafb","Type":"ContainerDied","Data":"fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f"} Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.846451 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.846474 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd923b040df120c4e0db13faba3dd082748215832fde69efa3820a548d40466f" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.847903 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerStarted","Data":"efb67e23acbfdf21e1128206db0796da92a2b8b3505fcf7184c374d562e0ebb6"} Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.847934 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerStarted","Data":"0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef"} Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.850350 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" event={"ID":"bcd9000d-51f5-47d1-9fb0-a1177a6c6b06","Type":"ContainerDied","Data":"f686dbacce0862c41ca0b35de88c3a41d09cd938400978d49b9794435f651b3e"} Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.850396 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f686dbacce0862c41ca0b35de88c3a41d09cd938400978d49b9794435f651b3e" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.850447 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb" Jan 27 00:19:48 crc kubenswrapper[4774]: I0127 00:19:48.907117 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.282436 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.857438 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerID="efb67e23acbfdf21e1128206db0796da92a2b8b3505fcf7184c374d562e0ebb6" exitCode=0 Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.857534 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerDied","Data":"efb67e23acbfdf21e1128206db0796da92a2b8b3505fcf7184c374d562e0ebb6"} Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.859317 4774 generic.go:334] "Generic (PLEG): container finished" podID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerID="4c26c2e2994e8be6a1fd5545240fd4ba414fbf6378b30c5235cd400317e57ad6" exitCode=0 Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.859347 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerDied","Data":"4c26c2e2994e8be6a1fd5545240fd4ba414fbf6378b30c5235cd400317e57ad6"} Jan 27 00:19:49 crc kubenswrapper[4774]: I0127 00:19:49.859363 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerStarted","Data":"c7f9a533937e391d5a101f3230a6fdf47cd308c1e2082bc57db248e561eaab18"} Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.700973 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.702788 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.728594 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.767212 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.767295 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.767320 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.868378 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.868429 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.868478 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.869136 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.869166 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.911825 4774 generic.go:334] "Generic 
(PLEG): container finished" podID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerID="200e6d7e4a8ae08ab9f21528d73e613a734eef3b1ea5439197e326a0aac07a73" exitCode=0 Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.911889 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerDied","Data":"200e6d7e4a8ae08ab9f21528d73e613a734eef3b1ea5439197e326a0aac07a73"} Jan 27 00:19:51 crc kubenswrapper[4774]: I0127 00:19:51.929238 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") pod \"community-operators-9wztc\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.018386 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.362701 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.921157 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerStarted","Data":"9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180"} Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.927590 4774 generic.go:334] "Generic (PLEG): container finished" podID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerID="e02d219daaca4ec67fc454d718eec7db071fdd5266115161621d49cc1c997588" exitCode=0 Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.927640 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerDied","Data":"e02d219daaca4ec67fc454d718eec7db071fdd5266115161621d49cc1c997588"} Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.927669 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerStarted","Data":"f277365d77be95b43a841265074b0061aae099b70be4de813c5361a232b01168"} Jan 27 00:19:52 crc kubenswrapper[4774]: I0127 00:19:52.966220 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qdczk" podStartSLOduration=2.439468087 podStartE2EDuration="4.966197876s" podCreationTimestamp="2026-01-27 00:19:48 +0000 UTC" firstStartedPulling="2026-01-27 00:19:49.861753374 +0000 UTC m=+768.167530258" lastFinishedPulling="2026-01-27 00:19:52.388483163 +0000 UTC m=+770.694260047" observedRunningTime="2026-01-27 00:19:52.963277148 +0000 UTC m=+771.269054052" watchObservedRunningTime="2026-01-27 00:19:52.966197876 +0000 UTC m=+771.271974760" Jan 27 00:19:53 crc kubenswrapper[4774]: I0127 00:19:53.220985 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:53 crc kubenswrapper[4774]: I0127 00:19:53.221038 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:19:54 crc kubenswrapper[4774]: I0127 00:19:54.261337 4774 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pdxh5" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" probeResult="failure" output=< Jan 27 00:19:54 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:19:54 crc kubenswrapper[4774]: > Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.399676 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.400943 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.404630 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.404921 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-sp9r2" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.405751 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.433791 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.522758 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqdl8\" (UniqueName: \"kubernetes.io/projected/82ad6e88-a32b-4f4f-9a96-66d10c58a7d9-kube-api-access-qqdl8\") pod \"obo-prometheus-operator-68bc856cb9-qzvj5\" (UID: \"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.532754 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.533803 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.536270 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-mbppc" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.542455 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.543565 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.544289 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.555163 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.563514 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.623879 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.623945 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.623984 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqdl8\" (UniqueName: \"kubernetes.io/projected/82ad6e88-a32b-4f4f-9a96-66d10c58a7d9-kube-api-access-qqdl8\") pod \"obo-prometheus-operator-68bc856cb9-qzvj5\" (UID: \"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.624283 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.624524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.662143 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqdl8\" (UniqueName: \"kubernetes.io/projected/82ad6e88-a32b-4f4f-9a96-66d10c58a7d9-kube-api-access-qqdl8\") pod \"obo-prometheus-operator-68bc856cb9-qzvj5\" (UID: \"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.726228 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.726325 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.726374 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.726404 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.732586 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.734372 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.735052 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfee63d3-9a5d-46f6-b984-78d6a837e20c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-2b75r\" (UID: \"dfee63d3-9a5d-46f6-b984-78d6a837e20c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.739114 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ac18775-726a-43da-a184-dfd1565544f1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6495c554dc-548dl\" (UID: \"4ac18775-726a-43da-a184-dfd1565544f1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.739313 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.789486 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fqps4"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.790593 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.797228 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.800823 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-6l2g4" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.805615 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fqps4"] Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.854287 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.861545 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.930126 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2222c\" (UniqueName: \"kubernetes.io/projected/3247d37e-1277-411a-ad8b-ffcd6172206f-kube-api-access-2222c\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.930195 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3247d37e-1277-411a-ad8b-ffcd6172206f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.969493 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerStarted","Data":"51c589dd8edac479f82cbd8c35b089e741262f8274ee231b1ad43eb7751efc5f"} Jan 27 00:19:55 crc kubenswrapper[4774]: I0127 00:19:55.986305 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerStarted","Data":"6adb30d15bae1199f3712c4a8352afcec6a4c2a58544b7c28ed4b9cf80a6f3bb"} Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.031088 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2222c\" (UniqueName: \"kubernetes.io/projected/3247d37e-1277-411a-ad8b-ffcd6172206f-kube-api-access-2222c\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc 
kubenswrapper[4774]: I0127 00:19:56.031181 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3247d37e-1277-411a-ad8b-ffcd6172206f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.036756 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3247d37e-1277-411a-ad8b-ffcd6172206f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.068607 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2222c\" (UniqueName: \"kubernetes.io/projected/3247d37e-1277-411a-ad8b-ffcd6172206f-kube-api-access-2222c\") pod \"observability-operator-59bdc8b94-fqps4\" (UID: \"3247d37e-1277-411a-ad8b-ffcd6172206f\") " pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.098935 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b7pt6"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.099668 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.105069 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-r49fd" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.108586 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.119799 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b7pt6"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.134827 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9nqq\" (UniqueName: \"kubernetes.io/projected/96fc7bad-ed57-4110-afa8-9a6e5748c292-kube-api-access-z9nqq\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.134960 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96fc7bad-ed57-4110-afa8-9a6e5748c292-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.238342 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9nqq\" (UniqueName: \"kubernetes.io/projected/96fc7bad-ed57-4110-afa8-9a6e5748c292-kube-api-access-z9nqq\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.238406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96fc7bad-ed57-4110-afa8-9a6e5748c292-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.239627 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/96fc7bad-ed57-4110-afa8-9a6e5748c292-openshift-service-ca\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.303529 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xcbvz"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.304511 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.306197 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.306471 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-4gbp9" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.309292 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xcbvz"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.312751 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.319756 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9nqq\" (UniqueName: \"kubernetes.io/projected/96fc7bad-ed57-4110-afa8-9a6e5748c292-kube-api-access-z9nqq\") pod \"perses-operator-5bf474d74f-b7pt6\" (UID: \"96fc7bad-ed57-4110-afa8-9a6e5748c292\") " pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.341090 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzln8\" (UniqueName: \"kubernetes.io/projected/f2a5d99b-17f5-4a46-958b-ae997b57245e-kube-api-access-rzln8\") pod \"interconnect-operator-5bb49f789d-xcbvz\" (UID: \"f2a5d99b-17f5-4a46-958b-ae997b57245e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.389602 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.421947 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl"] Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.443910 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzln8\" (UniqueName: \"kubernetes.io/projected/f2a5d99b-17f5-4a46-958b-ae997b57245e-kube-api-access-rzln8\") pod \"interconnect-operator-5bb49f789d-xcbvz\" (UID: \"f2a5d99b-17f5-4a46-958b-ae997b57245e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:56 crc kubenswrapper[4774]: W0127 00:19:56.444488 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ad6e88_a32b_4f4f_9a96_66d10c58a7d9.slice/crio-284d11414e55ca8dab56eea35d18f671d13330367fb3820e00dd13656d6118b8 WatchSource:0}: Error finding container 284d11414e55ca8dab56eea35d18f671d13330367fb3820e00dd13656d6118b8: Status 404 returned error can't find the container with id 284d11414e55ca8dab56eea35d18f671d13330367fb3820e00dd13656d6118b8 Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.472701 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzln8\" (UniqueName: \"kubernetes.io/projected/f2a5d99b-17f5-4a46-958b-ae997b57245e-kube-api-access-rzln8\") pod \"interconnect-operator-5bb49f789d-xcbvz\" (UID: \"f2a5d99b-17f5-4a46-958b-ae997b57245e\") " pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.476576 4774 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:19:56 crc kubenswrapper[4774]: I0127 00:19:56.599738 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-fqps4"] Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.615429 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r"] Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.661790 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.996327 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" event={"ID":"4ac18775-726a-43da-a184-dfd1565544f1","Type":"ContainerStarted","Data":"710fda1e9451d191c352c52b5bb38bf519dc16b3bce39730f227bbf272a04907"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.998909 4774 generic.go:334] "Generic (PLEG): container finished" podID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerID="6adb30d15bae1199f3712c4a8352afcec6a4c2a58544b7c28ed4b9cf80a6f3bb" exitCode=0 Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:56.999075 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerDied","Data":"6adb30d15bae1199f3712c4a8352afcec6a4c2a58544b7c28ed4b9cf80a6f3bb"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:57.001997 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" event={"ID":"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9","Type":"ContainerStarted","Data":"284d11414e55ca8dab56eea35d18f671d13330367fb3820e00dd13656d6118b8"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:57.003285 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" event={"ID":"dfee63d3-9a5d-46f6-b984-78d6a837e20c","Type":"ContainerStarted","Data":"09d1242fce25c6ddeeab10eb41618a1e4c826f0c9b1869609c276595691e4b15"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:57.004909 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" event={"ID":"3247d37e-1277-411a-ad8b-ffcd6172206f","Type":"ContainerStarted","Data":"6ef9d09a6906441a4b6dc94f41942b47e9cef322d2ab69022a11232682410f9e"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.021890 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerID="51c589dd8edac479f82cbd8c35b089e741262f8274ee231b1ad43eb7751efc5f" exitCode=0 Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.021947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerDied","Data":"51c589dd8edac479f82cbd8c35b089e741262f8274ee231b1ad43eb7751efc5f"} Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.409798 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-b7pt6"] Jan 27 00:19:58 crc kubenswrapper[4774]: W0127 00:19:58.452928 4774 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96fc7bad_ed57_4110_afa8_9a6e5748c292.slice/crio-ee01d005e5e2a3afbb3cfe092ef36b4f0edf40fdb33cc589d50a166a2f084c93 WatchSource:0}: Error finding container ee01d005e5e2a3afbb3cfe092ef36b4f0edf40fdb33cc589d50a166a2f084c93: Status 404 returned error can't find the container with id ee01d005e5e2a3afbb3cfe092ef36b4f0edf40fdb33cc589d50a166a2f084c93 Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.674987 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-xcbvz"] Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.908240 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.908297 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:58 crc kubenswrapper[4774]: I0127 00:19:58.976056 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.046850 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerStarted","Data":"c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220"} Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.057322 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" event={"ID":"f2a5d99b-17f5-4a46-958b-ae997b57245e","Type":"ContainerStarted","Data":"49b9392f4bdc8a90e41d7d787075c697016901a5f86ec0aeebdd4cad653eaeb0"} Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.066569 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" event={"ID":"96fc7bad-ed57-4110-afa8-9a6e5748c292","Type":"ContainerStarted","Data":"ee01d005e5e2a3afbb3cfe092ef36b4f0edf40fdb33cc589d50a166a2f084c93"} Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.076002 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9wztc" podStartSLOduration=2.394659352 podStartE2EDuration="8.075973951s" podCreationTimestamp="2026-01-27 00:19:51 +0000 UTC" firstStartedPulling="2026-01-27 00:19:52.942085061 +0000 UTC m=+771.247861945" lastFinishedPulling="2026-01-27 00:19:58.62339966 +0000 UTC m=+776.929176544" observedRunningTime="2026-01-27 00:19:59.072432127 +0000 UTC m=+777.378209021" watchObservedRunningTime="2026-01-27 00:19:59.075973951 +0000 UTC m=+777.381750855" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.077802 4774 generic.go:334] "Generic (PLEG): container finished" podID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerID="a3e4452fbf1853f438d267a232202d0745bc9c8697654b5a77a29ceb58fd6789" exitCode=0 Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.078606 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerDied","Data":"a3e4452fbf1853f438d267a232202d0745bc9c8697654b5a77a29ceb58fd6789"} Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.148323 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.576632 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-688b5775f4-bvrcz"] Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.577507 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.581287 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-c9bfw" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.581513 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.600441 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-688b5775f4-bvrcz"] Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.712107 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-apiservice-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.712154 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gszqw\" (UniqueName: \"kubernetes.io/projected/e6700f3e-4423-48fe-94ae-562483cf3a18-kube-api-access-gszqw\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.712217 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-webhook-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.813955 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-webhook-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.814081 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-apiservice-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.814111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gszqw\" (UniqueName: \"kubernetes.io/projected/e6700f3e-4423-48fe-94ae-562483cf3a18-kube-api-access-gszqw\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.822937 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-webhook-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.831727 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6700f3e-4423-48fe-94ae-562483cf3a18-apiservice-cert\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.845673 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gszqw\" (UniqueName: \"kubernetes.io/projected/e6700f3e-4423-48fe-94ae-562483cf3a18-kube-api-access-gszqw\") pod \"elastic-operator-688b5775f4-bvrcz\" (UID: \"e6700f3e-4423-48fe-94ae-562483cf3a18\") " pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:19:59 crc kubenswrapper[4774]: I0127 00:19:59.896211 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.261960 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-688b5775f4-bvrcz"] Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.537519 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.630367 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") pod \"4bdb298d-0c46-429d-b4c2-44d106881eb7\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.630488 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") pod \"4bdb298d-0c46-429d-b4c2-44d106881eb7\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.630549 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") pod \"4bdb298d-0c46-429d-b4c2-44d106881eb7\" (UID: \"4bdb298d-0c46-429d-b4c2-44d106881eb7\") " Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.631745 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle" (OuterVolumeSpecName: "bundle") pod "4bdb298d-0c46-429d-b4c2-44d106881eb7" (UID: "4bdb298d-0c46-429d-b4c2-44d106881eb7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.638040 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p" (OuterVolumeSpecName: "kube-api-access-dw46p") pod "4bdb298d-0c46-429d-b4c2-44d106881eb7" (UID: "4bdb298d-0c46-429d-b4c2-44d106881eb7"). InnerVolumeSpecName "kube-api-access-dw46p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.645784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util" (OuterVolumeSpecName: "util") pod "4bdb298d-0c46-429d-b4c2-44d106881eb7" (UID: "4bdb298d-0c46-429d-b4c2-44d106881eb7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.732641 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.732699 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw46p\" (UniqueName: \"kubernetes.io/projected/4bdb298d-0c46-429d-b4c2-44d106881eb7-kube-api-access-dw46p\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:00 crc kubenswrapper[4774]: I0127 00:20:00.732715 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4bdb298d-0c46-429d-b4c2-44d106881eb7-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:01 crc kubenswrapper[4774]: I0127 00:20:01.116969 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" event={"ID":"e6700f3e-4423-48fe-94ae-562483cf3a18","Type":"ContainerStarted","Data":"494fd68295d72075193ae11fc25cf75b4b825a4ee078b8e85903808e0a46e289"} Jan 27 00:20:01 crc kubenswrapper[4774]: I0127 00:20:01.119771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" event={"ID":"4bdb298d-0c46-429d-b4c2-44d106881eb7","Type":"ContainerDied","Data":"0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef"} Jan 27 00:20:01 crc kubenswrapper[4774]: I0127 00:20:01.119792 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b3a751df39c09e406c339e045a1a8e26cff9cffe0a00cfe8379e61eb23fe3ef" Jan 27 00:20:01 crc kubenswrapper[4774]: I0127 00:20:01.119895 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq" Jan 27 00:20:02 crc kubenswrapper[4774]: I0127 00:20:02.018880 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:02 crc kubenswrapper[4774]: I0127 00:20:02.019212 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:02 crc kubenswrapper[4774]: I0127 00:20:02.074030 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:03 crc kubenswrapper[4774]: I0127 00:20:03.294038 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:20:03 crc kubenswrapper[4774]: I0127 00:20:03.351187 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:20:04 crc kubenswrapper[4774]: I0127 00:20:04.297724 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:20:04 crc kubenswrapper[4774]: I0127 00:20:04.298387 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qdczk" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" containerID="cri-o://9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" gracePeriod=2 Jan 27 00:20:05 crc kubenswrapper[4774]: I0127 00:20:05.155833 4774 generic.go:334] "Generic (PLEG): container finished" podID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" exitCode=0 Jan 27 00:20:05 crc kubenswrapper[4774]: I0127 00:20:05.155896 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerDied","Data":"9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180"} Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.675716 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.675776 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.675819 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.676417 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Jan 27 00:20:06 crc kubenswrapper[4774]: I0127 00:20:06.676479 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442" gracePeriod=600 Jan 27 00:20:07 crc kubenswrapper[4774]: I0127 00:20:07.170398 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442"} Jan 27 00:20:07 crc kubenswrapper[4774]: I0127 00:20:07.170480 4774 scope.go:117] "RemoveContainer" containerID="1835e32e3ad8e493de6e8f27a5dbcc7b2eb8f1908c27b56eb3a4006aa36c9b0d" Jan 27 00:20:07 crc kubenswrapper[4774]: I0127 00:20:07.170390 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442" exitCode=0 Jan 27 00:20:08 crc kubenswrapper[4774]: I0127 00:20:08.682161 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:20:08 crc kubenswrapper[4774]: I0127 00:20:08.682741 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pdxh5" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" containerID="cri-o://27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" gracePeriod=2 Jan 27 00:20:08 crc kubenswrapper[4774]: E0127 00:20:08.910412 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180 is running failed: container process not found" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:08 crc kubenswrapper[4774]: E0127 00:20:08.911434 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180 is running failed: container process not found" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:08 crc kubenswrapper[4774]: E0127 00:20:08.912073 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180 is running failed: container process not found" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:08 crc kubenswrapper[4774]: E0127 00:20:08.912112 4774 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-qdczk" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" Jan 27 00:20:09 crc 
kubenswrapper[4774]: I0127 00:20:09.212662 4774 generic.go:334] "Generic (PLEG): container finished" podID="4c9c1faf-541d-4491-b37e-99909c74944b" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" exitCode=0 Jan 27 00:20:09 crc kubenswrapper[4774]: I0127 00:20:09.212699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerDied","Data":"27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3"} Jan 27 00:20:12 crc kubenswrapper[4774]: I0127 00:20:12.116750 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:13 crc kubenswrapper[4774]: I0127 00:20:13.085154 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:20:13 crc kubenswrapper[4774]: I0127 00:20:13.085438 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9wztc" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="registry-server" containerID="cri-o://c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220" gracePeriod=2 Jan 27 00:20:13 crc kubenswrapper[4774]: E0127 00:20:13.221953 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3 is running failed: container process not found" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:13 crc kubenswrapper[4774]: E0127 00:20:13.222450 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3 is running failed: container process not found" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:13 crc kubenswrapper[4774]: E0127 00:20:13.222685 4774 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3 is running failed: container process not found" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 00:20:13 crc kubenswrapper[4774]: E0127 00:20:13.222718 4774 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-pdxh5" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" Jan 27 00:20:13 crc kubenswrapper[4774]: I0127 00:20:13.250190 4774 generic.go:334] "Generic (PLEG): container finished" podID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerID="c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220" exitCode=0 Jan 27 00:20:13 crc kubenswrapper[4774]: I0127 00:20:13.250259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" 
event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerDied","Data":"c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220"} Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.979244 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw"] Jan 27 00:20:15 crc kubenswrapper[4774]: E0127 00:20:15.980095 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="util" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980129 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="util" Jan 27 00:20:15 crc kubenswrapper[4774]: E0127 00:20:15.980156 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="extract" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980162 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="extract" Jan 27 00:20:15 crc kubenswrapper[4774]: E0127 00:20:15.980170 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="pull" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980177 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="pull" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980286 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdb298d-0c46-429d-b4c2-44d106881eb7" containerName="extract" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.980727 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.982667 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.983536 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-fnw7h" Jan 27 00:20:15 crc kubenswrapper[4774]: I0127 00:20:15.983981 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.008243 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw"] Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.093471 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de9ae8e-6adf-4a50-9482-0ca1d1691559-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.093570 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lfj\" (UniqueName: \"kubernetes.io/projected/6de9ae8e-6adf-4a50-9482-0ca1d1691559-kube-api-access-l8lfj\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.129126 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.129440 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) 
--images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) --openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:dc62889b883f597de91b5389cc52c84c607247d49a807693be2f688e4703dfc3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:1b555e21bba7c609111ace4380382a696d9aceeb6e9816bf9023b8f689b6c741,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:a223bab813b82d698992490bbb60927f6288a83ba52d539836c250e1471f6d34,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:e797cdb47beef40b04da7b6d645bca3dc32e6247003c45b56b38efd9e13bf01c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:093d2731ac848ed5fd57356b155a19d3bf7b8db96d95b09c5d0095e143f7254f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:7d662a120305e2528acc7e9142b770b5b6a7f4932ddfcadfa4ac953935124895,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:75465aabb0aa427a5c531a8fcde463f6d119afbcc618ebcbf6b7ee9bc8aad160,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:dc18c8d6a4a9a0a574a57cc5082c8a9b26023bd6d69b9732892d584c1dfe5070,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:369729978cecdc13c99ef3d179f8eb8a450a4a0cb70b63c27a55a15d1710ba27,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:d8c7a61d147f62b204d5c5f16864386025393453c9a81ea327bbd25d7765d611,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:b4a6eb1cc118a4334b424614959d8b7f361ddd779b3a72690ca49b0a3f26d9b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:21d4fff670893ba4b7fbc528cd49f8b71c8281cede9ef84f0697065bb6a7fc50,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:12d9dbe297a1c3b9df671f21156992082bc483887d851fafe76e5d17321ff474,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:e65c37f04f6d76a0cbfe05edb3cddf6a8f14f859ee35cf3aebea8fcb991d2c19,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-opera
tor/cluster-health-analyzer-rhel9@sha256:48e4e178c6eeaa9d5dd77a591c185a311b4b4a5caadb7199d48463123e31dc9e,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2222c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-59bdc8b94-fqps4_openshift-operators(3247d37e-1277-411a-ad8b-ffcd6172206f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.130928 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" podUID="3247d37e-1277-411a-ad8b-ffcd6172206f" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.195407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de9ae8e-6adf-4a50-9482-0ca1d1691559-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.195497 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lfj\" (UniqueName: \"kubernetes.io/projected/6de9ae8e-6adf-4a50-9482-0ca1d1691559-kube-api-access-l8lfj\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.196249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de9ae8e-6adf-4a50-9482-0ca1d1691559-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.221037 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lfj\" (UniqueName: \"kubernetes.io/projected/6de9ae8e-6adf-4a50-9482-0ca1d1691559-kube-api-access-l8lfj\") pod \"cert-manager-operator-controller-manager-5446d6888b-zx7bw\" (UID: \"6de9ae8e-6adf-4a50-9482-0ca1d1691559\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.278575 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:2ecf763b02048d2cf4c17967a7b2cacc7afd6af0e963a39579d876f8f4170e3c\\\"\"" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" podUID="3247d37e-1277-411a-ad8b-ffcd6172206f" Jan 27 00:20:16 crc kubenswrapper[4774]: I0127 00:20:16.299639 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.612373 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.612665 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:interconnect-operator,Image:registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43,Command:[qdr-operator],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:60000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:qdr-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_QDROUTERD_IMAGE,Value:registry.redhat.io/amq7/amq-interconnect@sha256:31d87473fa684178a694f9ee331d3c80f2653f9533cb65c2a325752166a077e9,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:amq7-interconnect-operator.v1.10.20,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rzln8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod interconnect-operator-5bb49f789d-xcbvz_service-telemetry(f2a5d99b-17f5-4a46-958b-ae997b57245e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:20:16 crc kubenswrapper[4774]: E0127 00:20:16.617432 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" podUID="f2a5d99b-17f5-4a46-958b-ae997b57245e" Jan 27 00:20:17 crc kubenswrapper[4774]: E0127 00:20:17.289334 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43\\\"\"" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" podUID="f2a5d99b-17f5-4a46-958b-ae997b57245e" Jan 27 00:20:17 crc kubenswrapper[4774]: E0127 00:20:17.585163 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a" Jan 27 00:20:17 crc 
kubenswrapper[4774]: E0127 00:20:17.585446 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator --watch-referenced-objects-in-all-namespaces=true --disable-unmanaged-prometheus-configuration=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqdl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-68bc856cb9-qzvj5_openshift-operators(82ad6e88-a32b-4f4f-9a96-66d10c58a7d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 00:20:17 crc kubenswrapper[4774]: E0127 00:20:17.587041 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" podUID="82ad6e88-a32b-4f4f-9a96-66d10c58a7d9" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.616362 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.618157 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717024 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") pod \"bae6d967-d19c-4ab9-a2e2-21292d93389f\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717591 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") pod \"4c9c1faf-541d-4491-b37e-99909c74944b\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717631 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") pod \"4c9c1faf-541d-4491-b37e-99909c74944b\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717666 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") pod \"bae6d967-d19c-4ab9-a2e2-21292d93389f\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717696 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") pod \"bae6d967-d19c-4ab9-a2e2-21292d93389f\" (UID: \"bae6d967-d19c-4ab9-a2e2-21292d93389f\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.717801 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") pod \"4c9c1faf-541d-4491-b37e-99909c74944b\" (UID: \"4c9c1faf-541d-4491-b37e-99909c74944b\") " Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.719904 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities" (OuterVolumeSpecName: "utilities") pod "4c9c1faf-541d-4491-b37e-99909c74944b" (UID: "4c9c1faf-541d-4491-b37e-99909c74944b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.720285 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities" (OuterVolumeSpecName: "utilities") pod "bae6d967-d19c-4ab9-a2e2-21292d93389f" (UID: "bae6d967-d19c-4ab9-a2e2-21292d93389f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.736654 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx" (OuterVolumeSpecName: "kube-api-access-d8bxx") pod "bae6d967-d19c-4ab9-a2e2-21292d93389f" (UID: "bae6d967-d19c-4ab9-a2e2-21292d93389f"). InnerVolumeSpecName "kube-api-access-d8bxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.738260 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd" (OuterVolumeSpecName: "kube-api-access-tfmdd") pod "4c9c1faf-541d-4491-b37e-99909c74944b" (UID: "4c9c1faf-541d-4491-b37e-99909c74944b"). InnerVolumeSpecName "kube-api-access-tfmdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.785284 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bae6d967-d19c-4ab9-a2e2-21292d93389f" (UID: "bae6d967-d19c-4ab9-a2e2-21292d93389f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.819746 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfmdd\" (UniqueName: \"kubernetes.io/projected/4c9c1faf-541d-4491-b37e-99909c74944b-kube-api-access-tfmdd\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.820157 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8bxx\" (UniqueName: \"kubernetes.io/projected/bae6d967-d19c-4ab9-a2e2-21292d93389f-kube-api-access-d8bxx\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.820211 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.820262 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.820328 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bae6d967-d19c-4ab9-a2e2-21292d93389f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:17 crc kubenswrapper[4774]: I0127 00:20:17.970515 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c9c1faf-541d-4491-b37e-99909c74944b" (UID: "4c9c1faf-541d-4491-b37e-99909c74944b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.022878 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c9c1faf-541d-4491-b37e-99909c74944b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.066136 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.099698 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw"] Jan 27 00:20:18 crc kubenswrapper[4774]: W0127 00:20:18.112540 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6de9ae8e_6adf_4a50_9482_0ca1d1691559.slice/crio-9954491a314d6210216d0fa6ed6038755753f29c88b7200855f539fcc46d4c54 WatchSource:0}: Error finding container 9954491a314d6210216d0fa6ed6038755753f29c88b7200855f539fcc46d4c54: Status 404 returned error can't find the container with id 9954491a314d6210216d0fa6ed6038755753f29c88b7200855f539fcc46d4c54 Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.123534 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") pod \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.123760 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") pod \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.123787 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") pod \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\" (UID: \"4afde62c-1f8e-4d6f-87ab-b4710b6c7158\") " Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.128182 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities" (OuterVolumeSpecName: "utilities") pod "4afde62c-1f8e-4d6f-87ab-b4710b6c7158" (UID: "4afde62c-1f8e-4d6f-87ab-b4710b6c7158"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.136900 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t" (OuterVolumeSpecName: "kube-api-access-8vs8t") pod "4afde62c-1f8e-4d6f-87ab-b4710b6c7158" (UID: "4afde62c-1f8e-4d6f-87ab-b4710b6c7158"). InnerVolumeSpecName "kube-api-access-8vs8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.219724 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4afde62c-1f8e-4d6f-87ab-b4710b6c7158" (UID: "4afde62c-1f8e-4d6f-87ab-b4710b6c7158"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.225683 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vs8t\" (UniqueName: \"kubernetes.io/projected/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-kube-api-access-8vs8t\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.225902 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.225961 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4afde62c-1f8e-4d6f-87ab-b4710b6c7158-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.299417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wztc" event={"ID":"4afde62c-1f8e-4d6f-87ab-b4710b6c7158","Type":"ContainerDied","Data":"f277365d77be95b43a841265074b0061aae099b70be4de813c5361a232b01168"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.299551 4774 scope.go:117] "RemoveContainer" containerID="c5bf743ef777224a76b80bf6d3d237f0094e6821c6315c5a72dc6d8fb0479220" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.299460 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wztc" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.302597 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" event={"ID":"96fc7bad-ed57-4110-afa8-9a6e5748c292","Type":"ContainerStarted","Data":"3ab50ec43d30b6cacbebffd3612e5698f2534b69b0e20c1acd6ba5b645b76e20"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.303530 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.310641 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qdczk" event={"ID":"bae6d967-d19c-4ab9-a2e2-21292d93389f","Type":"ContainerDied","Data":"c7f9a533937e391d5a101f3230a6fdf47cd308c1e2082bc57db248e561eaab18"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.310781 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qdczk" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.317573 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" event={"ID":"6de9ae8e-6adf-4a50-9482-0ca1d1691559","Type":"ContainerStarted","Data":"9954491a314d6210216d0fa6ed6038755753f29c88b7200855f539fcc46d4c54"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.326991 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" podStartSLOduration=3.146570146 podStartE2EDuration="22.326964242s" podCreationTimestamp="2026-01-27 00:19:56 +0000 UTC" firstStartedPulling="2026-01-27 00:19:58.462610948 +0000 UTC m=+776.768387832" lastFinishedPulling="2026-01-27 00:20:17.643005044 +0000 UTC m=+795.948781928" observedRunningTime="2026-01-27 00:20:18.323769646 +0000 UTC m=+796.629546530" watchObservedRunningTime="2026-01-27 00:20:18.326964242 +0000 UTC m=+796.632741126" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.329699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdxh5" event={"ID":"4c9c1faf-541d-4491-b37e-99909c74944b","Type":"ContainerDied","Data":"23a340b31e207dc1319a830ec2ed78f64feba461f1f191f536b89271a9b624f9"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.329817 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pdxh5" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.337585 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" event={"ID":"4ac18775-726a-43da-a184-dfd1565544f1","Type":"ContainerStarted","Data":"bee7ecc2ff4418719abb4e0308ad2db20bcdc6e3438cdb67c4b1f924f2d71375"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.340676 4774 scope.go:117] "RemoveContainer" containerID="6adb30d15bae1199f3712c4a8352afcec6a4c2a58544b7c28ed4b9cf80a6f3bb" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.343174 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" event={"ID":"dfee63d3-9a5d-46f6-b984-78d6a837e20c","Type":"ContainerStarted","Data":"4600c5cb3c4f9b4561588557caae5830a0960b54703cbf258ac4f048be39f6b6"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.347910 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29"} Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.350838 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" event={"ID":"e6700f3e-4423-48fe-94ae-562483cf3a18","Type":"ContainerStarted","Data":"b810bc6a18801ba49e5d9a6f3beaae9060508746bf13d60cff6840025efc02f3"} Jan 27 00:20:18 crc kubenswrapper[4774]: E0127 00:20:18.351793 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a\\\"\"" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" podUID="82ad6e88-a32b-4f4f-9a96-66d10c58a7d9" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.359944 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.377829 4774 scope.go:117] "RemoveContainer" containerID="e02d219daaca4ec67fc454d718eec7db071fdd5266115161621d49cc1c997588" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.384777 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9wztc"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.425646 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-548dl" podStartSLOduration=2.233365354 podStartE2EDuration="23.425628506s" podCreationTimestamp="2026-01-27 00:19:55 +0000 UTC" firstStartedPulling="2026-01-27 00:19:56.45067862 +0000 UTC m=+774.756455504" lastFinishedPulling="2026-01-27 00:20:17.642941772 +0000 UTC m=+795.948718656" observedRunningTime="2026-01-27 00:20:18.394387052 +0000 UTC m=+796.700163946" watchObservedRunningTime="2026-01-27 00:20:18.425628506 +0000 UTC m=+796.731405390" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.432337 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.433473 4774 scope.go:117] "RemoveContainer" containerID="9a28eff5a79e321ec675a9fd59c9340dfabc3ec97580e7267958385db2dd9180" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.436719 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pdxh5"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.476580 4774 scope.go:117] "RemoveContainer" containerID="200e6d7e4a8ae08ab9f21528d73e613a734eef3b1ea5439197e326a0aac07a73" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.511747 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6495c554dc-2b75r" podStartSLOduration=2.584897758 podStartE2EDuration="23.511724664s" podCreationTimestamp="2026-01-27 00:19:55 +0000 UTC" firstStartedPulling="2026-01-27 00:19:56.63085072 +0000 UTC m=+774.936627604" lastFinishedPulling="2026-01-27 00:20:17.557677606 +0000 UTC m=+795.863454510" observedRunningTime="2026-01-27 00:20:18.510555703 +0000 UTC m=+796.816332587" watchObservedRunningTime="2026-01-27 00:20:18.511724664 +0000 UTC m=+796.817501558" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.531807 4774 scope.go:117] "RemoveContainer" containerID="4c26c2e2994e8be6a1fd5545240fd4ba414fbf6378b30c5235cd400317e57ad6" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.607344 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-688b5775f4-bvrcz" podStartSLOduration=2.255509023 podStartE2EDuration="19.607321986s" podCreationTimestamp="2026-01-27 00:19:59 +0000 UTC" firstStartedPulling="2026-01-27 00:20:00.281703058 +0000 UTC m=+778.587479942" lastFinishedPulling="2026-01-27 00:20:17.633516021 +0000 UTC m=+795.939292905" observedRunningTime="2026-01-27 00:20:18.60673339 +0000 UTC m=+796.912510284" watchObservedRunningTime="2026-01-27 00:20:18.607321986 +0000 UTC m=+796.913098880" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.629502 4774 
scope.go:117] "RemoveContainer" containerID="27982448ccecb62222446de28aefcec6ea7c21a2865d0158d3821ae0e47be1a3" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.671153 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.675020 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qdczk"] Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.692987 4774 scope.go:117] "RemoveContainer" containerID="92dbedab7e3a765202cdab5c278b77c503ff9fdad67f1ad130701e4b0d133e1e" Jan 27 00:20:18 crc kubenswrapper[4774]: I0127 00:20:18.720034 4774 scope.go:117] "RemoveContainer" containerID="44bf8961053363986c01a6554172bfd8b05751670ed90f16bf1e2f86b2ba58dc" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.436449 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438115 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438182 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438239 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438285 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438381 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438434 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438483 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438529 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438615 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438672 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.438764 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.438934 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="extract-content" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.439005 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" 
containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439052 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.439127 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439178 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: E0127 00:20:19.439272 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439324 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="extract-utilities" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439516 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439581 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.439629 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" containerName="registry-server" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.441653 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444214 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444413 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444437 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444653 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444741 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444824 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.444898 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.445223 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-z65l4" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.445523 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.453425 
4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579214 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579769 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579796 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579821 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/b4f320b3-4c9f-433a-943a-8f2934061b87-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.579846 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580060 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580181 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580234 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " 
pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580376 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580560 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580627 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580708 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.580733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682383 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: 
\"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682440 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682472 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682495 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682520 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682549 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682571 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682592 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682615 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/b4f320b3-4c9f-433a-943a-8f2934061b87-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682637 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682657 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682678 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682698 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682725 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.682745 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684090 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684120 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: 
\"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684474 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684552 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.684880 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.685353 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/b4f320b3-4c9f-433a-943a-8f2934061b87-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.685677 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.700874 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.703439 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.706918 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 
00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.708193 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.718496 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.722783 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/b4f320b3-4c9f-433a-943a-8f2934061b87-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.740461 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/b4f320b3-4c9f-433a-943a-8f2934061b87-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"b4f320b3-4c9f-433a-943a-8f2934061b87\") " pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:19 crc kubenswrapper[4774]: I0127 00:20:19.762214 4774 util.go:30] "No sandbox for pod can be found. 
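
The run of VerifyControllerAttachedVolume / MountVolume / MountVolume.SetUp entries above covers the ECK-managed volumes of elasticsearch-es-default-0 (secrets, configmaps, emptyDirs and the downward API volume). A minimal cross-check sketch, assuming only a reachable cluster and a kubeconfig in the default location, that reads the same volume list back from the API with client-go; the namespace and pod name are taken from the log above, everything else is illustrative:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Kubeconfig location is an assumption (clientcmd.RecommendedHomeFile is ~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Namespace and pod name are taken from the log entries above.
	pod, err := cs.CoreV1().Pods("service-telemetry").Get(context.TODO(), "elasticsearch-es-default-0", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, v := range pod.Spec.Volumes {
		// Print which volume source each entry uses, mirroring the
		// secret/configmap/empty-dir/downward-api mix seen in the mount log.
		fmt.Printf("volume %-55s secret=%-5v configMap=%-5v emptyDir=%-5v downwardAPI=%v\n",
			v.Name, v.Secret != nil, v.ConfigMap != nil, v.EmptyDir != nil, v.DownwardAPI != nil)
	}
}
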
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.219981 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.364352 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4afde62c-1f8e-4d6f-87ab-b4710b6c7158" path="/var/lib/kubelet/pods/4afde62c-1f8e-4d6f-87ab-b4710b6c7158/volumes" Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.365384 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c9c1faf-541d-4491-b37e-99909c74944b" path="/var/lib/kubelet/pods/4c9c1faf-541d-4491-b37e-99909c74944b/volumes" Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.366160 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae6d967-d19c-4ab9-a2e2-21292d93389f" path="/var/lib/kubelet/pods/bae6d967-d19c-4ab9-a2e2-21292d93389f/volumes" Jan 27 00:20:20 crc kubenswrapper[4774]: I0127 00:20:20.392353 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerStarted","Data":"3f5de3ffe8938d560ac5c34306502ab0706452284e8018d6fc6f6c64b1f944d9"} Jan 27 00:20:26 crc kubenswrapper[4774]: I0127 00:20:26.481064 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-b7pt6" Jan 27 00:20:32 crc kubenswrapper[4774]: I0127 00:20:32.492765 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" event={"ID":"f2a5d99b-17f5-4a46-958b-ae997b57245e","Type":"ContainerStarted","Data":"55e9c58d67bcf7565f4e2a76c9849b0567ed678bfcabf2a5096249212063644e"} Jan 27 00:20:32 crc kubenswrapper[4774]: I0127 00:20:32.495851 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" event={"ID":"6de9ae8e-6adf-4a50-9482-0ca1d1691559","Type":"ContainerStarted","Data":"30176e87a0c7738e220da2d37b7d3b9a6b192f87f7b55eca9cac835e7b2d763e"} Jan 27 00:20:32 crc kubenswrapper[4774]: I0127 00:20:32.523207 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-xcbvz" podStartSLOduration=2.9097625049999998 podStartE2EDuration="36.52318945s" podCreationTimestamp="2026-01-27 00:19:56 +0000 UTC" firstStartedPulling="2026-01-27 00:19:58.690528242 +0000 UTC m=+776.996305126" lastFinishedPulling="2026-01-27 00:20:32.303955187 +0000 UTC m=+810.609732071" observedRunningTime="2026-01-27 00:20:32.519573463 +0000 UTC m=+810.825350357" watchObservedRunningTime="2026-01-27 00:20:32.52318945 +0000 UTC m=+810.828966334" Jan 27 00:20:32 crc kubenswrapper[4774]: I0127 00:20:32.553587 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-zx7bw" podStartSLOduration=7.113680386 podStartE2EDuration="17.553572941s" podCreationTimestamp="2026-01-27 00:20:15 +0000 UTC" firstStartedPulling="2026-01-27 00:20:18.118286312 +0000 UTC m=+796.424063196" lastFinishedPulling="2026-01-27 00:20:28.558178877 +0000 UTC m=+806.863955751" observedRunningTime="2026-01-27 00:20:32.547993873 +0000 UTC m=+810.853770767" watchObservedRunningTime="2026-01-27 00:20:32.553572941 +0000 UTC m=+810.859349825" Jan 27 00:20:35 crc kubenswrapper[4774]: 
I0127 00:20:35.401073 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-gg42r"] Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.402478 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.406241 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.408608 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.421033 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-gg42r"] Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.574988 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2jb\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-kube-api-access-xz2jb\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.575101 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.677444 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2jb\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-kube-api-access-xz2jb\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.677581 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.719738 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:35 crc kubenswrapper[4774]: I0127 00:20:35.720469 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2jb\" (UniqueName: \"kubernetes.io/projected/32c71eef-359e-42d1-a9c8-d0f392ecabaa-kube-api-access-xz2jb\") pod \"cert-manager-webhook-f4fb5df64-gg42r\" (UID: \"32c71eef-359e-42d1-a9c8-d0f392ecabaa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:36 crc kubenswrapper[4774]: I0127 00:20:36.018390 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:36 crc kubenswrapper[4774]: I0127 00:20:36.295278 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-gg42r"] Jan 27 00:20:36 crc kubenswrapper[4774]: I0127 00:20:36.522307 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" event={"ID":"32c71eef-359e-42d1-a9c8-d0f392ecabaa","Type":"ContainerStarted","Data":"3f0a8ca9eaa5f3e751d24842cda55b2d88f42fca268c52b199efd3ddd35b5457"} Jan 27 00:20:38 crc kubenswrapper[4774]: I0127 00:20:38.535990 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerStarted","Data":"e1c28ef26536e4b801203ccfc64b75e994e7aa6d50dc91a3221cd5034177d536"} Jan 27 00:20:38 crc kubenswrapper[4774]: I0127 00:20:38.537325 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" event={"ID":"82ad6e88-a32b-4f4f-9a96-66d10c58a7d9","Type":"ContainerStarted","Data":"a571534328c7c3da2d9655ac739f8026e17b47f3e16424be24dbd388ac1b7422"} Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.562006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" event={"ID":"3247d37e-1277-411a-ad8b-ffcd6172206f","Type":"ContainerStarted","Data":"e6d6567f12a4bc63a0ce4a43c029910c706c61db654ed91c528debe4b527127e"} Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.562909 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.566225 4774 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-fqps4 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/healthz\": dial tcp 10.217.0.48:8081: connect: connection refused" start-of-body= Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.566308 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" podUID="3247d37e-1277-411a-ad8b-ffcd6172206f" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.48:8081/healthz\": dial tcp 10.217.0.48:8081: connect: connection refused" Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.619443 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qzvj5" podStartSLOduration=8.925443095 podStartE2EDuration="44.619422039s" podCreationTimestamp="2026-01-27 00:19:55 +0000 UTC" firstStartedPulling="2026-01-27 00:19:56.451029471 +0000 UTC m=+774.756806355" lastFinishedPulling="2026-01-27 00:20:32.145008425 +0000 UTC m=+810.450785299" observedRunningTime="2026-01-27 00:20:39.615103163 +0000 UTC m=+817.920880057" watchObservedRunningTime="2026-01-27 00:20:39.619422039 +0000 UTC m=+817.925198923" Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.637692 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" podStartSLOduration=2.128552186 podStartE2EDuration="44.637665686s" podCreationTimestamp="2026-01-27 00:19:55 +0000 UTC" firstStartedPulling="2026-01-27 00:19:56.631181149 +0000 UTC m=+774.936958023" 
lastFinishedPulling="2026-01-27 00:20:39.140294639 +0000 UTC m=+817.446071523" observedRunningTime="2026-01-27 00:20:39.63261657 +0000 UTC m=+817.938393484" watchObservedRunningTime="2026-01-27 00:20:39.637665686 +0000 UTC m=+817.943442580" Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.792945 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:39 crc kubenswrapper[4774]: I0127 00:20:39.835627 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 00:20:40 crc kubenswrapper[4774]: I0127 00:20:40.570015 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-fqps4" Jan 27 00:20:41 crc kubenswrapper[4774]: I0127 00:20:41.582342 4774 generic.go:334] "Generic (PLEG): container finished" podID="b4f320b3-4c9f-433a-943a-8f2934061b87" containerID="e1c28ef26536e4b801203ccfc64b75e994e7aa6d50dc91a3221cd5034177d536" exitCode=0 Jan 27 00:20:41 crc kubenswrapper[4774]: I0127 00:20:41.582437 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerDied","Data":"e1c28ef26536e4b801203ccfc64b75e994e7aa6d50dc91a3221cd5034177d536"} Jan 27 00:20:42 crc kubenswrapper[4774]: I0127 00:20:42.598848 4774 generic.go:334] "Generic (PLEG): container finished" podID="b4f320b3-4c9f-433a-943a-8f2934061b87" containerID="2cdca33b20fa2d3239d49076e22a71d3376abb0212bea8ffe215758cdb461050" exitCode=0 Jan 27 00:20:42 crc kubenswrapper[4774]: I0127 00:20:42.599027 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerDied","Data":"2cdca33b20fa2d3239d49076e22a71d3376abb0212bea8ffe215758cdb461050"} Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.623098 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6"] Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.624306 4774 util.go:30] "No sandbox for pod can be found. 
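
The pod_startup_latency_tracker entries above appear to report podStartSLOduration as the end-to-end startup time minus the time spent pulling images: for obo-prometheus-operator-68bc856cb9-qzvj5, 44.619422039s minus the pull window from 00:19:56.451029471 to 00:20:32.145008425 (about 35.694s) gives roughly 8.925s, closely matching the logged value. A small sketch of that arithmetic, using only values copied from the log, with the timestamps re-expressed in RFC 3339 form:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the obo-prometheus-operator-68bc856cb9-qzvj5 entry above.
	podStartE2E := 44619422039 * time.Nanosecond // podStartE2EDuration=44.619422039s
	firstStartedPulling, _ := time.Parse(time.RFC3339Nano, "2026-01-27T00:19:56.451029471Z")
	lastFinishedPulling, _ := time.Parse(time.RFC3339Nano, "2026-01-27T00:20:32.145008425Z")

	pulling := lastFinishedPulling.Sub(firstStartedPulling)
	slo := podStartE2E - pulling
	// Prints roughly 35.693978954s of pulling and about 8.925443s of derived SLO
	// duration, closely matching podStartSLOduration=8.925443095 reported above.
	fmt.Printf("image pulling: %v, derived SLO duration: %v\n", pulling, slo)
}
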
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.626395 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-knclw" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.640695 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6"] Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.809176 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.809668 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flxf9\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-kube-api-access-flxf9\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.910835 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.910909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flxf9\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-kube-api-access-flxf9\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.939907 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flxf9\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-kube-api-access-flxf9\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:44 crc kubenswrapper[4774]: I0127 00:20:44.955434 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84e34f19-48c2-4096-83e7-95b9fd16a1e6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-hp7q6\" (UID: \"84e34f19-48c2-4096-83e7-95b9fd16a1e6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:45 crc kubenswrapper[4774]: I0127 00:20:45.239540 4774 util.go:30] "No sandbox for pod can be found. 
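
The patch_prober.go / prober.go entries a few records above show a failed HTTP readiness probe for observability-operator-59bdc8b94-fqps4 (connection refused on http://10.217.0.48:8081/healthz) followed shortly by a ready status. A rough, hand-rolled equivalent of such an HTTP readiness check, not the kubelet's actual prober, assuming only the URL reported in the log and the default 1-second probe timeout:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// ready performs a plain HTTP GET and treats any 2xx/3xx answer as "ready",
// returning false on connection errors such as the "connection refused" above.
func ready(url string) bool {
	client := &http.Client{Timeout: 1 * time.Second} // default probe timeoutSeconds is 1
	resp, err := client.Get(url)
	if err != nil {
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400
}

func main() {
	// URL taken from the failed probe entry; it is only reachable from inside the cluster network.
	fmt.Println(ready("http://10.217.0.48:8081/healthz"))
}
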
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.385216 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6"] Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.657574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"b4f320b3-4c9f-433a-943a-8f2934061b87","Type":"ContainerStarted","Data":"e9c15ad3203348e76f2c44f83792b6fbe61c8d02f239c5791fb1ac2a48d30915"} Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.658203 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.660538 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" event={"ID":"32c71eef-359e-42d1-a9c8-d0f392ecabaa","Type":"ContainerStarted","Data":"a591c674af2565d2a8fcf2c7062d7c4e51e5dffac8cf0cf285e0834a15dbe116"} Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.660623 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.662314 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" event={"ID":"84e34f19-48c2-4096-83e7-95b9fd16a1e6","Type":"ContainerStarted","Data":"f17ccc67d5046a300871dda616003a877a1195a908d7a53ae8d2007bf5cf3931"} Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.662414 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" event={"ID":"84e34f19-48c2-4096-83e7-95b9fd16a1e6","Type":"ContainerStarted","Data":"d8c598429154924b660f5b84216ce367590333a4f519c51d7b7036a7d3777657"} Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.699470 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=17.524848111 podStartE2EDuration="29.699448283s" podCreationTimestamp="2026-01-27 00:20:19 +0000 UTC" firstStartedPulling="2026-01-27 00:20:20.235739005 +0000 UTC m=+798.541515889" lastFinishedPulling="2026-01-27 00:20:32.410339177 +0000 UTC m=+810.716116061" observedRunningTime="2026-01-27 00:20:48.698768535 +0000 UTC m=+827.004545419" watchObservedRunningTime="2026-01-27 00:20:48.699448283 +0000 UTC m=+827.005225187" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.713549 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-hp7q6" podStartSLOduration=4.713522429 podStartE2EDuration="4.713522429s" podCreationTimestamp="2026-01-27 00:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:20:48.711841654 +0000 UTC m=+827.017618538" watchObservedRunningTime="2026-01-27 00:20:48.713522429 +0000 UTC m=+827.019299313" Jan 27 00:20:48 crc kubenswrapper[4774]: I0127 00:20:48.742521 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" podStartSLOduration=1.8219873930000001 podStartE2EDuration="13.742497433s" podCreationTimestamp="2026-01-27 00:20:35 +0000 UTC" firstStartedPulling="2026-01-27 00:20:36.307744416 +0000 UTC m=+814.613521300" 
lastFinishedPulling="2026-01-27 00:20:48.228254456 +0000 UTC m=+826.534031340" observedRunningTime="2026-01-27 00:20:48.737109909 +0000 UTC m=+827.042886803" watchObservedRunningTime="2026-01-27 00:20:48.742497433 +0000 UTC m=+827.048274317" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.268565 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9mmc4"] Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.269362 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.271693 4774 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pnb79" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.295933 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb98j\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-kube-api-access-sb98j\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.295996 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-bound-sa-token\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.296136 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9mmc4"] Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.397801 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb98j\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-kube-api-access-sb98j\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.397884 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-bound-sa-token\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.439714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-bound-sa-token\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.441194 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb98j\" (UniqueName: \"kubernetes.io/projected/0a5e3cb3-d23f-405b-bb88-18159ee24067-kube-api-access-sb98j\") pod \"cert-manager-86cb77c54b-9mmc4\" (UID: \"0a5e3cb3-d23f-405b-bb88-18159ee24067\") " pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:49 crc kubenswrapper[4774]: I0127 00:20:49.585979 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" Jan 27 00:20:50 crc kubenswrapper[4774]: I0127 00:20:50.061971 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-9mmc4"] Jan 27 00:20:50 crc kubenswrapper[4774]: W0127 00:20:50.067667 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a5e3cb3_d23f_405b_bb88_18159ee24067.slice/crio-0e6dc7a8d6f8a5e9d781f0edf8c2af1ebab705bfc12e7c81a1d525f71b3bb6e9 WatchSource:0}: Error finding container 0e6dc7a8d6f8a5e9d781f0edf8c2af1ebab705bfc12e7c81a1d525f71b3bb6e9: Status 404 returned error can't find the container with id 0e6dc7a8d6f8a5e9d781f0edf8c2af1ebab705bfc12e7c81a1d525f71b3bb6e9 Jan 27 00:20:50 crc kubenswrapper[4774]: I0127 00:20:50.689576 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" event={"ID":"0a5e3cb3-d23f-405b-bb88-18159ee24067","Type":"ContainerStarted","Data":"f6f2826e7ba0d1ce7cd4913ae053b4ee300ca2c5cd151a0ad737743c6a76ef60"} Jan 27 00:20:50 crc kubenswrapper[4774]: I0127 00:20:50.689655 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" event={"ID":"0a5e3cb3-d23f-405b-bb88-18159ee24067","Type":"ContainerStarted","Data":"0e6dc7a8d6f8a5e9d781f0edf8c2af1ebab705bfc12e7c81a1d525f71b3bb6e9"} Jan 27 00:20:50 crc kubenswrapper[4774]: I0127 00:20:50.713774 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-9mmc4" podStartSLOduration=1.713753404 podStartE2EDuration="1.713753404s" podCreationTimestamp="2026-01-27 00:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:20:50.706347205 +0000 UTC m=+829.012124089" watchObservedRunningTime="2026-01-27 00:20:50.713753404 +0000 UTC m=+829.019530288" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.495328 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.496917 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.502014 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.502205 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.502014 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.503055 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540295 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540420 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540471 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540513 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540574 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.540628 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542588 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542797 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542887 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.542947 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.545415 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644721 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644768 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644789 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644833 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644853 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644906 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644925 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644947 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.644979 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.645017 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.645290 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.645364 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.645641 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646056 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646120 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646248 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646336 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 
00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646442 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.646752 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.650549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.651698 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.662167 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") pod \"service-telemetry-operator-1-build\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:51 crc kubenswrapper[4774]: I0127 00:20:51.812347 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:20:52 crc kubenswrapper[4774]: I0127 00:20:52.077657 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:20:52 crc kubenswrapper[4774]: W0127 00:20:52.103156 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod357826c8_1161_4d5b_9f9b_f1b432b59bb4.slice/crio-f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac WatchSource:0}: Error finding container f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac: Status 404 returned error can't find the container with id f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac Jan 27 00:20:52 crc kubenswrapper[4774]: I0127 00:20:52.707445 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerStarted","Data":"f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac"} Jan 27 00:20:56 crc kubenswrapper[4774]: I0127 00:20:56.026174 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-gg42r" Jan 27 00:20:58 crc kubenswrapper[4774]: I0127 00:20:58.751174 4774 generic.go:334] "Generic (PLEG): container finished" podID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerID="9c4f05efee4df97958a36a2ceae5c08933e7191a90c105019b63116157393129" exitCode=0 Jan 27 00:20:58 crc kubenswrapper[4774]: I0127 00:20:58.751265 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerDied","Data":"9c4f05efee4df97958a36a2ceae5c08933e7191a90c105019b63116157393129"} Jan 27 00:20:59 crc kubenswrapper[4774]: I0127 00:20:59.760963 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerStarted","Data":"eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b"} Jan 27 00:20:59 crc kubenswrapper[4774]: I0127 00:20:59.811107 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=2.81807664 podStartE2EDuration="8.811064209s" podCreationTimestamp="2026-01-27 00:20:51 +0000 UTC" firstStartedPulling="2026-01-27 00:20:52.113360115 +0000 UTC m=+830.419137009" lastFinishedPulling="2026-01-27 00:20:58.106347694 +0000 UTC m=+836.412124578" observedRunningTime="2026-01-27 00:20:59.803449037 +0000 UTC m=+838.109226001" watchObservedRunningTime="2026-01-27 00:20:59.811064209 +0000 UTC m=+838.116841143" Jan 27 00:20:59 crc kubenswrapper[4774]: I0127 00:20:59.916501 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="b4f320b3-4c9f-433a-943a-8f2934061b87" containerName="elasticsearch" probeResult="failure" output=< Jan 27 00:20:59 crc kubenswrapper[4774]: {"timestamp": "2026-01-27T00:20:59+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 27 00:20:59 crc kubenswrapper[4774]: > Jan 27 00:21:01 crc kubenswrapper[4774]: I0127 00:21:01.571772 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:21:01 crc kubenswrapper[4774]: I0127 
00:21:01.775447 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="docker-build" containerID="cri-o://eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b" gracePeriod=30 Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.168382 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.171516 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.173521 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.173774 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.174161 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.201669 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.226935 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227051 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227105 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227139 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227177 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") pod 
\"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227209 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227276 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227326 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227363 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227406 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227440 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.227483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334309 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334379 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334434 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334463 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334508 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334534 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334561 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334605 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334631 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.334673 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335018 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335495 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335523 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335591 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335930 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.335975 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.336171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.336420 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.340721 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.341487 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.342918 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.352920 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") pod \"service-telemetry-operator-2-build\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:03 crc kubenswrapper[4774]: I0127 00:21:03.540980 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:21:04 crc kubenswrapper[4774]: I0127 00:21:04.563365 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 00:21:04 crc kubenswrapper[4774]: I0127 00:21:04.800478 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerStarted","Data":"5bbb584405fb7631a8a17fc2b018cadebd4020bea20a1f59c42abfddafe4d91f"} Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.086388 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.815784 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_357826c8-1161-4d5b-9f9b-f1b432b59bb4/docker-build/0.log" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.818761 4774 generic.go:334] "Generic (PLEG): container finished" podID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerID="eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b" exitCode=1 Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.818968 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerDied","Data":"eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b"} Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.821916 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerStarted","Data":"65bae08af99d67fd59ec59dbc59a548eb6b9f634c7a63b5b18d30048a6914234"} Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.965901 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_357826c8-1161-4d5b-9f9b-f1b432b59bb4/docker-build/0.log" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.966762 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.981882 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.981958 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.981988 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982070 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982066 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982130 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982157 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982496 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982849 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.982995 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983075 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983739 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983893 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983924 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.983988 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984023 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984068 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") pod \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\" (UID: \"357826c8-1161-4d5b-9f9b-f1b432b59bb4\") " Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984322 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984427 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984714 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984837 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984852 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984887 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984896 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357826c8-1161-4d5b-9f9b-f1b432b59bb4-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984906 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984916 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984924 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.984980 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.992163 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.996081 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg" (OuterVolumeSpecName: "kube-api-access-wx9bg") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "kube-api-access-wx9bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:21:05 crc kubenswrapper[4774]: I0127 00:21:05.996117 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "357826c8-1161-4d5b-9f9b-f1b432b59bb4" (UID: "357826c8-1161-4d5b-9f9b-f1b432b59bb4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086567 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086603 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx9bg\" (UniqueName: \"kubernetes.io/projected/357826c8-1161-4d5b-9f9b-f1b432b59bb4-kube-api-access-wx9bg\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086615 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086627 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/357826c8-1161-4d5b-9f9b-f1b432b59bb4-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.086638 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/357826c8-1161-4d5b-9f9b-f1b432b59bb4-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.829717 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_357826c8-1161-4d5b-9f9b-f1b432b59bb4/docker-build/0.log" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.830507 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"357826c8-1161-4d5b-9f9b-f1b432b59bb4","Type":"ContainerDied","Data":"f668e73eccb9ae027260398cd4c656b2c1c4645de8d3889a343b6e8e287f7bac"} Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.830518 4774 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.830582 4774 scope.go:117] "RemoveContainer" containerID="eb1dad4864af4f00eec4bc1736ac8585f605d9937ad1e15a146624a4cb19819b" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.854956 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.856469 4774 scope.go:117] "RemoveContainer" containerID="9c4f05efee4df97958a36a2ceae5c08933e7191a90c105019b63116157393129" Jan 27 00:21:06 crc kubenswrapper[4774]: I0127 00:21:06.862102 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 00:21:08 crc kubenswrapper[4774]: I0127 00:21:08.364077 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" path="/var/lib/kubelet/pods/357826c8-1161-4d5b-9f9b-f1b432b59bb4/volumes" Jan 27 00:21:12 crc kubenswrapper[4774]: I0127 00:21:12.875111 4774 generic.go:334] "Generic (PLEG): container finished" podID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerID="65bae08af99d67fd59ec59dbc59a548eb6b9f634c7a63b5b18d30048a6914234" exitCode=0 Jan 27 00:21:12 crc kubenswrapper[4774]: I0127 00:21:12.875196 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerDied","Data":"65bae08af99d67fd59ec59dbc59a548eb6b9f634c7a63b5b18d30048a6914234"} Jan 27 00:21:13 crc kubenswrapper[4774]: I0127 00:21:13.893571 4774 generic.go:334] "Generic (PLEG): container finished" podID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerID="324a3bc382f8ead3e05849a9fd24d9f5c0bfb892f82bc846544ebd46627c5a36" exitCode=0 Jan 27 00:21:13 crc kubenswrapper[4774]: I0127 00:21:13.893642 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerDied","Data":"324a3bc382f8ead3e05849a9fd24d9f5c0bfb892f82bc846544ebd46627c5a36"} Jan 27 00:21:13 crc kubenswrapper[4774]: I0127 00:21:13.944060 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_e97d63e3-09a4-4cd1-9019-b10dfb8d2d48/manage-dockerfile/0.log" Jan 27 00:21:14 crc kubenswrapper[4774]: I0127 00:21:14.905964 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerStarted","Data":"c6b52da603998873d5e158aa465c1e4ee9dee4ac321b833341a4872c06415c80"} Jan 27 00:21:14 crc kubenswrapper[4774]: I0127 00:21:14.955384 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=11.955364512 podStartE2EDuration="11.955364512s" podCreationTimestamp="2026-01-27 00:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:21:14.951400276 +0000 UTC m=+853.257177160" watchObservedRunningTime="2026-01-27 00:21:14.955364512 +0000 UTC m=+853.261141406" Jan 27 00:22:35 crc kubenswrapper[4774]: I0127 00:22:35.482631 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerID="c6b52da603998873d5e158aa465c1e4ee9dee4ac321b833341a4872c06415c80" exitCode=0 Jan 27 00:22:35 crc kubenswrapper[4774]: I0127 00:22:35.482659 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerDied","Data":"c6b52da603998873d5e158aa465c1e4ee9dee4ac321b833341a4872c06415c80"} Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.675633 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.675701 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.797618 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925162 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925236 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925283 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925328 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925363 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925446 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: 
\"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925545 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925619 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925727 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.925945 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926091 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926245 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926381 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") pod \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\" (UID: \"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48\") " Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926291 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926558 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.926655 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927128 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927155 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927167 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927176 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.927185 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.928628 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.933786 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.934762 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.935776 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf" (OuterVolumeSpecName: "kube-api-access-jh4cf") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "kube-api-access-jh4cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:22:36 crc kubenswrapper[4774]: I0127 00:22:36.966705 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029312 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029395 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029409 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh4cf\" (UniqueName: \"kubernetes.io/projected/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-kube-api-access-jh4cf\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029422 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.029433 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.131710 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.234433 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.503372 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"e97d63e3-09a4-4cd1-9019-b10dfb8d2d48","Type":"ContainerDied","Data":"5bbb584405fb7631a8a17fc2b018cadebd4020bea20a1f59c42abfddafe4d91f"} Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.503755 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bbb584405fb7631a8a17fc2b018cadebd4020bea20a1f59c42abfddafe4d91f" Jan 27 00:22:37 crc kubenswrapper[4774]: I0127 00:22:37.503518 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 00:22:39 crc kubenswrapper[4774]: I0127 00:22:39.012234 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" (UID: "e97d63e3-09a4-4cd1-9019-b10dfb8d2d48"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:39 crc kubenswrapper[4774]: I0127 00:22:39.063184 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e97d63e3-09a4-4cd1-9019-b10dfb8d2d48-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.791390 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792013 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="git-clone" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792026 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="git-clone" Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792037 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792045 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792058 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="manage-dockerfile" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792064 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="manage-dockerfile" Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792074 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792079 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" 
containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: E0127 00:22:41.792087 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="manage-dockerfile" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792093 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="manage-dockerfile" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792193 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="357826c8-1161-4d5b-9f9b-f1b432b59bb4" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792208 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e97d63e3-09a4-4cd1-9019-b10dfb8d2d48" containerName="docker-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.792882 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.795880 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.796140 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.796329 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.796470 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.811973 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902649 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902692 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902720 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902748 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: 
\"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902765 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902793 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902812 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902834 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902875 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902893 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902915 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:41 crc kubenswrapper[4774]: I0127 00:22:41.902929 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004564 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004622 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004709 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004734 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004764 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004788 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004828 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004877 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004942 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.004963 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005225 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005279 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005416 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005705 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.005747 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.006990 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.007257 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.007594 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.012734 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.015205 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.021944 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") pod \"smart-gateway-operator-1-build\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.111109 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.335729 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:42 crc kubenswrapper[4774]: I0127 00:22:42.542738 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerStarted","Data":"efbb54a015404b8967513a09f9917177f0d8854494fa4a5dbf69d8215e48a96a"} Jan 27 00:22:43 crc kubenswrapper[4774]: I0127 00:22:43.553723 4774 generic.go:334] "Generic (PLEG): container finished" podID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerID="6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f" exitCode=0 Jan 27 00:22:43 crc kubenswrapper[4774]: I0127 00:22:43.553784 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerDied","Data":"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f"} Jan 27 00:22:44 crc kubenswrapper[4774]: I0127 00:22:44.566081 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerStarted","Data":"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20"} Jan 27 00:22:44 crc kubenswrapper[4774]: I0127 00:22:44.604268 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.604240004 podStartE2EDuration="3.604240004s" podCreationTimestamp="2026-01-27 00:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:22:44.602731324 +0000 UTC m=+942.908508238" watchObservedRunningTime="2026-01-27 00:22:44.604240004 +0000 UTC m=+942.910016888" Jan 27 00:22:52 crc kubenswrapper[4774]: I0127 00:22:52.313372 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:52 crc kubenswrapper[4774]: I0127 00:22:52.314777 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="docker-build" containerID="cri-o://365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" gracePeriod=30 Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.265151 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_186f6989-c0d9-41b8-9036-6f5f74966b06/docker-build/0.log" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.266260 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407018 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407140 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407173 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407256 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407243 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407299 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407334 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407367 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407281 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407423 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407471 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407525 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407550 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") pod \"186f6989-c0d9-41b8-9036-6f5f74966b06\" (UID: \"186f6989-c0d9-41b8-9036-6f5f74966b06\") " Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407942 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.407958 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/186f6989-c0d9-41b8-9036-6f5f74966b06-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.408385 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.408466 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.409029 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.409078 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.409468 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.414402 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.415747 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6" (OuterVolumeSpecName: "kube-api-access-kdtr6") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "kube-api-access-kdtr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.417732 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509528 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509571 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdtr6\" (UniqueName: \"kubernetes.io/projected/186f6989-c0d9-41b8-9036-6f5f74966b06-kube-api-access-kdtr6\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509583 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509596 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509611 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509623 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509750 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/186f6989-c0d9-41b8-9036-6f5f74966b06-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.509763 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/186f6989-c0d9-41b8-9036-6f5f74966b06-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.605120 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.611664 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.643661 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_186f6989-c0d9-41b8-9036-6f5f74966b06/docker-build/0.log" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.644692 4774 generic.go:334] "Generic (PLEG): container finished" podID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerID="365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" exitCode=1 Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.644807 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.644783 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerDied","Data":"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20"} Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.645094 4774 scope.go:117] "RemoveContainer" containerID="365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.645023 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"186f6989-c0d9-41b8-9036-6f5f74966b06","Type":"ContainerDied","Data":"efbb54a015404b8967513a09f9917177f0d8854494fa4a5dbf69d8215e48a96a"} Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.746635 4774 scope.go:117] "RemoveContainer" containerID="6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.779608 4774 scope.go:117] "RemoveContainer" containerID="365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" Jan 27 00:22:53 crc kubenswrapper[4774]: E0127 00:22:53.780256 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20\": container with ID starting with 365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20 not found: ID does not exist" containerID="365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.780362 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20"} err="failed to get container status \"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20\": rpc error: code = NotFound desc = could not find container \"365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20\": container with ID starting with 365438942be10c8e2f2deaf13dc3408742b815eca417fdf2b3f417751be1ff20 not found: ID does not exist" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.780462 4774 scope.go:117] "RemoveContainer" containerID="6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f" Jan 27 00:22:53 crc kubenswrapper[4774]: E0127 00:22:53.780870 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f\": container with ID starting with 6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f not found: ID does not exist" containerID="6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.780902 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f"} err="failed to get container status \"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f\": rpc error: code = NotFound desc = could not find container \"6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f\": container with ID starting with 6cc4b94a7e8a0d14a173c7898dbbbcd4967f76fb50b07b8ec55001abc1413d8f not found: ID does not exist" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.845921 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "186f6989-c0d9-41b8-9036-6f5f74966b06" (UID: "186f6989-c0d9-41b8-9036-6f5f74966b06"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.925524 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/186f6989-c0d9-41b8-9036-6f5f74966b06-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.931097 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Jan 27 00:22:53 crc kubenswrapper[4774]: E0127 00:22:53.931525 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="manage-dockerfile" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.931560 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="manage-dockerfile" Jan 27 00:22:53 crc kubenswrapper[4774]: E0127 00:22:53.931602 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="docker-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.931622 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="docker-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.931920 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" containerName="docker-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.933549 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.937961 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.937988 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.938395 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Jan 27 00:22:53 crc kubenswrapper[4774]: I0127 00:22:53.964433 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.012119 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.026216 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027203 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027451 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027488 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027651 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027736 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027765 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027799 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027831 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.027971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.028068 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.028122 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130541 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130654 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: 
\"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130705 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130772 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130847 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130916 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.130961 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131006 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131078 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131126 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131160 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131179 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131429 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131227 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131832 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.131939 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.132029 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.132197 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.132522 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.133059 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.133134 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.136534 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.137985 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.155366 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") pod \"smart-gateway-operator-2-build\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.266093 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.372553 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186f6989-c0d9-41b8-9036-6f5f74966b06" path="/var/lib/kubelet/pods/186f6989-c0d9-41b8-9036-6f5f74966b06/volumes" Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.551843 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Jan 27 00:22:54 crc kubenswrapper[4774]: I0127 00:22:54.659456 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerStarted","Data":"6ba529326fadbe0cbf460b5a864b9676a02a66194862dcf2709f9808afc41984"} Jan 27 00:22:55 crc kubenswrapper[4774]: I0127 00:22:55.675061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerStarted","Data":"9fa2ef5e46f34f2f8bee5eebee95b472ff3968fc4432383de2683e43e9110407"} Jan 27 00:22:56 crc kubenswrapper[4774]: I0127 00:22:56.686986 4774 generic.go:334] "Generic (PLEG): container finished" podID="1c7b5d79-5dff-4a72-9864-22416e852110" containerID="9fa2ef5e46f34f2f8bee5eebee95b472ff3968fc4432383de2683e43e9110407" exitCode=0 Jan 27 00:22:56 crc kubenswrapper[4774]: I0127 00:22:56.687113 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerDied","Data":"9fa2ef5e46f34f2f8bee5eebee95b472ff3968fc4432383de2683e43e9110407"} Jan 27 00:22:57 crc kubenswrapper[4774]: I0127 00:22:57.696359 4774 generic.go:334] "Generic (PLEG): container finished" podID="1c7b5d79-5dff-4a72-9864-22416e852110" containerID="e38e1b3fb869e2dfbe9321a3bb8c8c6137628317c6b252437553f0df41c5c826" exitCode=0 Jan 27 00:22:57 crc kubenswrapper[4774]: I0127 00:22:57.696416 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerDied","Data":"e38e1b3fb869e2dfbe9321a3bb8c8c6137628317c6b252437553f0df41c5c826"} Jan 27 00:22:57 crc kubenswrapper[4774]: I0127 00:22:57.755193 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_1c7b5d79-5dff-4a72-9864-22416e852110/manage-dockerfile/0.log" Jan 27 00:22:58 crc kubenswrapper[4774]: I0127 00:22:58.706427 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerStarted","Data":"b39e5887c85163fcc961ded078df14a493b8d525c609add8e875af954134fc02"} Jan 27 00:22:58 crc kubenswrapper[4774]: I0127 00:22:58.744722 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.74468799 podStartE2EDuration="5.74468799s" podCreationTimestamp="2026-01-27 00:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:22:58.739460319 +0000 UTC m=+957.045237213" watchObservedRunningTime="2026-01-27 00:22:58.74468799 +0000 UTC m=+957.050464914" Jan 27 00:23:06 crc kubenswrapper[4774]: I0127 00:23:06.675690 4774 patch_prober.go:28] interesting 
pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:23:06 crc kubenswrapper[4774]: I0127 00:23:06.676516 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.675798 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.676930 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.677015 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.678138 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:23:36 crc kubenswrapper[4774]: I0127 00:23:36.678282 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29" gracePeriod=600 Jan 27 00:23:39 crc kubenswrapper[4774]: I0127 00:23:39.210182 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29" exitCode=0 Jan 27 00:23:39 crc kubenswrapper[4774]: I0127 00:23:39.210878 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29"} Jan 27 00:23:39 crc kubenswrapper[4774]: I0127 00:23:39.210908 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb"} Jan 27 00:23:39 crc kubenswrapper[4774]: I0127 00:23:39.210926 4774 scope.go:117] "RemoveContainer" containerID="b1c2487de37d74b3854324adcbb324d646465194c27d05f07ed619de40219442" Jan 27 00:24:06 
crc kubenswrapper[4774]: I0127 00:24:06.478517 4774 generic.go:334] "Generic (PLEG): container finished" podID="1c7b5d79-5dff-4a72-9864-22416e852110" containerID="b39e5887c85163fcc961ded078df14a493b8d525c609add8e875af954134fc02" exitCode=0 Jan 27 00:24:06 crc kubenswrapper[4774]: I0127 00:24:06.479145 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerDied","Data":"b39e5887c85163fcc961ded078df14a493b8d525c609add8e875af954134fc02"} Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.824220 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.890541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.890940 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.891294 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.891543 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.891845 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.892764 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.893403 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.893698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.894034 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.894698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.895739 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.896234 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.895020 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.895620 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.895669 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.897499 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.898656 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") pod \"1c7b5d79-5dff-4a72-9864-22416e852110\" (UID: \"1c7b5d79-5dff-4a72-9864-22416e852110\") " Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.898712 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.899926 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.900182 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.900460 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.900641 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/1c7b5d79-5dff-4a72-9864-22416e852110-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.900818 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1c7b5d79-5dff-4a72-9864-22416e852110-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.901061 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.907562 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.909211 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.912158 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:24:07 crc kubenswrapper[4774]: I0127 00:24:07.915174 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b" (OuterVolumeSpecName: "kube-api-access-69k7b") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "kube-api-access-69k7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.002757 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69k7b\" (UniqueName: \"kubernetes.io/projected/1c7b5d79-5dff-4a72-9864-22416e852110-kube-api-access-69k7b\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.002803 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.002819 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.002834 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/1c7b5d79-5dff-4a72-9864-22416e852110-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.075658 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.103330 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.510372 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"1c7b5d79-5dff-4a72-9864-22416e852110","Type":"ContainerDied","Data":"6ba529326fadbe0cbf460b5a864b9676a02a66194862dcf2709f9808afc41984"} Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.510802 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba529326fadbe0cbf460b5a864b9676a02a66194862dcf2709f9808afc41984" Jan 27 00:24:08 crc kubenswrapper[4774]: I0127 00:24:08.510953 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Jan 27 00:24:09 crc kubenswrapper[4774]: I0127 00:24:09.978456 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "1c7b5d79-5dff-4a72-9864-22416e852110" (UID: "1c7b5d79-5dff-4a72-9864-22416e852110"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:10 crc kubenswrapper[4774]: I0127 00:24:10.039639 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/1c7b5d79-5dff-4a72-9864-22416e852110-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.507860 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:12 crc kubenswrapper[4774]: E0127 00:24:12.508538 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="docker-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.508553 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="docker-build" Jan 27 00:24:12 crc kubenswrapper[4774]: E0127 00:24:12.508585 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="git-clone" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.508594 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="git-clone" Jan 27 00:24:12 crc kubenswrapper[4774]: E0127 00:24:12.508604 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="manage-dockerfile" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.508611 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="manage-dockerfile" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.508736 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7b5d79-5dff-4a72-9864-22416e852110" containerName="docker-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.509536 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.512196 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.512372 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.512492 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.512539 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.533209 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580437 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580495 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580537 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580583 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580612 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580638 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580759 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.580998 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.581049 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.581097 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.581130 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.581179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.682970 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683106 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683242 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683244 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683320 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683354 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683660 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683742 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.683853 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684017 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " 
pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684652 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684563 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684513 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.684901 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.685039 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.685318 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.691523 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.691641 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.705591 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") pod \"sg-core-1-build\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " pod="service-telemetry/sg-core-1-build" Jan 27 00:24:12 crc kubenswrapper[4774]: I0127 00:24:12.827636 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Jan 27 00:24:13 crc kubenswrapper[4774]: I0127 00:24:13.136949 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:13 crc kubenswrapper[4774]: I0127 00:24:13.573957 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerStarted","Data":"f77c18044669e0ee98035472ea754e01d9c30dbaba522e3d30ba4cdb3c4735ff"} Jan 27 00:24:14 crc kubenswrapper[4774]: I0127 00:24:14.588590 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerID="4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9" exitCode=0 Jan 27 00:24:14 crc kubenswrapper[4774]: I0127 00:24:14.588752 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerDied","Data":"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9"} Jan 27 00:24:15 crc kubenswrapper[4774]: I0127 00:24:15.604320 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerStarted","Data":"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb"} Jan 27 00:24:15 crc kubenswrapper[4774]: I0127 00:24:15.630505 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=3.630481305 podStartE2EDuration="3.630481305s" podCreationTimestamp="2026-01-27 00:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:24:15.627277179 +0000 UTC m=+1033.933054073" watchObservedRunningTime="2026-01-27 00:24:15.630481305 +0000 UTC m=+1033.936258209" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.035562 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.036336 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="docker-build" containerID="cri-o://5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" gracePeriod=30 Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.417648 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_ba507aea-37a7-4fb3-b2f1-64fb8966d9a4/docker-build/0.log" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.418795 4774 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470078 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470153 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470237 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470304 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470341 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470401 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470510 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470553 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470665 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470772 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.470818 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") pod \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\" (UID: \"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4\") " Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.471195 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.471406 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.471434 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.472670 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.472809 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.472931 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.473088 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.473135 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.477661 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.478152 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh" (OuterVolumeSpecName: "kube-api-access-4qthh") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "kube-api-access-4qthh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.479252 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573255 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573289 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qthh\" (UniqueName: \"kubernetes.io/projected/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-kube-api-access-4qthh\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573298 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573312 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573321 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573329 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573339 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.573349 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.579172 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: 
"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.615989 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" (UID: "ba507aea-37a7-4fb3-b2f1-64fb8966d9a4"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.671798 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_ba507aea-37a7-4fb3-b2f1-64fb8966d9a4/docker-build/0.log" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673041 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerID="5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" exitCode=1 Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673106 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerDied","Data":"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb"} Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673154 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"ba507aea-37a7-4fb3-b2f1-64fb8966d9a4","Type":"ContainerDied","Data":"f77c18044669e0ee98035472ea754e01d9c30dbaba522e3d30ba4cdb3c4735ff"} Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673162 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.673184 4774 scope.go:117] "RemoveContainer" containerID="5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.674660 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.674712 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.726547 4774 scope.go:117] "RemoveContainer" containerID="4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.736292 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.744843 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.785607 4774 scope.go:117] "RemoveContainer" containerID="5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" Jan 27 00:24:23 crc kubenswrapper[4774]: E0127 00:24:23.786416 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb\": container with ID starting with 5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb not found: ID does not exist" containerID="5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.786492 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb"} err="failed to get container status \"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb\": rpc error: code = NotFound desc = could not find container \"5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb\": container with ID starting with 5bac0179610e2a0f6042ecf1963c8cdabcd51dde56c8043b0b02a5ae47b325cb not found: ID does not exist" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.786541 4774 scope.go:117] "RemoveContainer" containerID="4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9" Jan 27 00:24:23 crc kubenswrapper[4774]: E0127 00:24:23.787941 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9\": container with ID starting with 4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9 not found: ID does not exist" containerID="4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9" Jan 27 00:24:23 crc kubenswrapper[4774]: I0127 00:24:23.788019 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9"} err="failed to get container status \"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9\": rpc error: code = NotFound desc = could 
not find container \"4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9\": container with ID starting with 4ed5bd56e89b741018bcb1aca994d92297fbdda79de6139f62b0b9bd1d7ac9c9 not found: ID does not exist" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.371482 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" path="/var/lib/kubelet/pods/ba507aea-37a7-4fb3-b2f1-64fb8966d9a4/volumes" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.845920 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Jan 27 00:24:24 crc kubenswrapper[4774]: E0127 00:24:24.846575 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="docker-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.846604 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="docker-build" Jan 27 00:24:24 crc kubenswrapper[4774]: E0127 00:24:24.846638 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="manage-dockerfile" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.846652 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="manage-dockerfile" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.846831 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba507aea-37a7-4fb3-b2f1-64fb8966d9a4" containerName="docker-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.848319 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.852073 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.852337 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.853441 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.853689 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.878070 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894424 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894503 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894545 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894675 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894761 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894810 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894912 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.894967 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.895021 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.895160 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.895261 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: 
I0127 00:24:24.895302 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998124 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998237 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998302 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998394 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998426 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998444 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998465 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998618 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998700 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998746 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998832 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998935 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.998989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.999230 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.999276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:24 crc kubenswrapper[4774]: I0127 00:24:24.999673 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") pod \"sg-core-2-build\" (UID: 
\"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.000497 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.000691 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.001046 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.010102 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.010579 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.026646 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") pod \"sg-core-2-build\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.163640 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.453200 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Jan 27 00:24:25 crc kubenswrapper[4774]: I0127 00:24:25.692457 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerStarted","Data":"95377822472fd85dfa0453210cab1aec6065d108dc6943b9c636d085246845d4"} Jan 27 00:24:26 crc kubenswrapper[4774]: I0127 00:24:26.706024 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerStarted","Data":"6438ea88db110cce5cbdfae5af7c90136af6a8c7be2b9b1644fc72bd6a2df7bd"} Jan 27 00:24:27 crc kubenswrapper[4774]: I0127 00:24:27.716282 4774 generic.go:334] "Generic (PLEG): container finished" podID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerID="6438ea88db110cce5cbdfae5af7c90136af6a8c7be2b9b1644fc72bd6a2df7bd" exitCode=0 Jan 27 00:24:27 crc kubenswrapper[4774]: I0127 00:24:27.716326 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerDied","Data":"6438ea88db110cce5cbdfae5af7c90136af6a8c7be2b9b1644fc72bd6a2df7bd"} Jan 27 00:24:28 crc kubenswrapper[4774]: I0127 00:24:28.727122 4774 generic.go:334] "Generic (PLEG): container finished" podID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerID="6b54eca9456d1e4d144051ab9ec59ce55311d9899f1cb227c509f4950fa46689" exitCode=0 Jan 27 00:24:28 crc kubenswrapper[4774]: I0127 00:24:28.727195 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerDied","Data":"6b54eca9456d1e4d144051ab9ec59ce55311d9899f1cb227c509f4950fa46689"} Jan 27 00:24:28 crc kubenswrapper[4774]: I0127 00:24:28.771229 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_62f2cb11-10c8-4570-a3f6-3e869ea9dfd4/manage-dockerfile/0.log" Jan 27 00:24:29 crc kubenswrapper[4774]: I0127 00:24:29.741487 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerStarted","Data":"12e30af316e67a99c16a4887b96ef7142665aad739f15328a784cb874cc31ed0"} Jan 27 00:24:29 crc kubenswrapper[4774]: I0127 00:24:29.784504 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.78446784 podStartE2EDuration="5.78446784s" podCreationTimestamp="2026-01-27 00:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:24:29.783771751 +0000 UTC m=+1048.089548675" watchObservedRunningTime="2026-01-27 00:24:29.78446784 +0000 UTC m=+1048.090244794" Jan 27 00:26:06 crc kubenswrapper[4774]: I0127 00:26:06.675317 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:26:06 crc kubenswrapper[4774]: I0127 00:26:06.675956 4774 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:26:36 crc kubenswrapper[4774]: I0127 00:26:36.675449 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:26:36 crc kubenswrapper[4774]: I0127 00:26:36.676265 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.675711 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.676756 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.676842 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.678008 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:27:06 crc kubenswrapper[4774]: I0127 00:27:06.678127 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb" gracePeriod=600 Jan 27 00:27:07 crc kubenswrapper[4774]: I0127 00:27:07.976066 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb" exitCode=0 Jan 27 00:27:07 crc kubenswrapper[4774]: I0127 00:27:07.976133 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb"} Jan 27 00:27:07 crc kubenswrapper[4774]: I0127 00:27:07.977048 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5"} Jan 27 00:27:07 crc kubenswrapper[4774]: I0127 00:27:07.977096 4774 scope.go:117] "RemoveContainer" containerID="9eedfdb07f2482a1f5cd24c38dd8c573ca8994af4d572b2eee7e79784fd3eb29" Jan 27 00:27:43 crc kubenswrapper[4774]: I0127 00:27:43.288337 4774 generic.go:334] "Generic (PLEG): container finished" podID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerID="12e30af316e67a99c16a4887b96ef7142665aad739f15328a784cb874cc31ed0" exitCode=0 Jan 27 00:27:43 crc kubenswrapper[4774]: I0127 00:27:43.288406 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerDied","Data":"12e30af316e67a99c16a4887b96ef7142665aad739f15328a784cb874cc31ed0"} Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.673177 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.831318 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832170 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832213 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832245 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832280 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832317 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832342 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832453 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.832470 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833165 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833199 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833247 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833320 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833248 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833443 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833492 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") pod \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\" (UID: \"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4\") " Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.834360 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833537 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.833559 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.834398 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.834507 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.834538 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.840468 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.842463 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9" (OuterVolumeSpecName: "kube-api-access-4blp9") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "kube-api-access-4blp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.843970 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.852370 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.936936 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4blp9\" (UniqueName: \"kubernetes.io/projected/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-kube-api-access-4blp9\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.936995 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.937018 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.937040 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.937057 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:44 crc kubenswrapper[4774]: I0127 00:27:44.937074 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.191219 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.241778 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.312090 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"62f2cb11-10c8-4570-a3f6-3e869ea9dfd4","Type":"ContainerDied","Data":"95377822472fd85dfa0453210cab1aec6065d108dc6943b9c636d085246845d4"} Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.312158 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Jan 27 00:27:45 crc kubenswrapper[4774]: I0127 00:27:45.312163 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95377822472fd85dfa0453210cab1aec6065d108dc6943b9c636d085246845d4" Jan 27 00:27:48 crc kubenswrapper[4774]: I0127 00:27:48.005741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" (UID: "62f2cb11-10c8-4570-a3f6-3e869ea9dfd4"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:27:48 crc kubenswrapper[4774]: I0127 00:27:48.097182 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/62f2cb11-10c8-4570-a3f6-3e869ea9dfd4-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.670805 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:27:49 crc kubenswrapper[4774]: E0127 00:27:49.672476 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="docker-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.672607 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="docker-build" Jan 27 00:27:49 crc kubenswrapper[4774]: E0127 00:27:49.672676 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="git-clone" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.672835 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="git-clone" Jan 27 00:27:49 crc kubenswrapper[4774]: E0127 00:27:49.672923 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="manage-dockerfile" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.672980 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="manage-dockerfile" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.673157 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f2cb11-10c8-4570-a3f6-3e869ea9dfd4" containerName="docker-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.674450 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.677403 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.677843 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.677897 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.678115 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.697797 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828676 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828736 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828771 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828787 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828806 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828939 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.828996 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829028 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829051 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829267 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829340 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.829395 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931244 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931341 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931404 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931445 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931498 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931528 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931593 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931713 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931787 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931835 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.931906 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932320 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932442 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932482 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932520 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.932696 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.933147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.933441 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.933742 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.934347 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.941231 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " 
pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.943854 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:49 crc kubenswrapper[4774]: I0127 00:27:49.956764 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") pod \"sg-bridge-1-build\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:50 crc kubenswrapper[4774]: I0127 00:27:50.001956 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Jan 27 00:27:50 crc kubenswrapper[4774]: I0127 00:27:50.321908 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:27:50 crc kubenswrapper[4774]: I0127 00:27:50.370046 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerStarted","Data":"ea252ad661071f08d5d4dbb63efce080e36285b3984e1fe7d27a8a1b1c69df27"} Jan 27 00:27:51 crc kubenswrapper[4774]: I0127 00:27:51.380326 4774 generic.go:334] "Generic (PLEG): container finished" podID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerID="f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f" exitCode=0 Jan 27 00:27:51 crc kubenswrapper[4774]: I0127 00:27:51.380399 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerDied","Data":"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f"} Jan 27 00:27:52 crc kubenswrapper[4774]: I0127 00:27:52.391723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerStarted","Data":"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11"} Jan 27 00:27:52 crc kubenswrapper[4774]: I0127 00:27:52.438485 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.438457909 podStartE2EDuration="3.438457909s" podCreationTimestamp="2026-01-27 00:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:27:52.435378446 +0000 UTC m=+1250.741155330" watchObservedRunningTime="2026-01-27 00:27:52.438457909 +0000 UTC m=+1250.744234823" Jan 27 00:27:59 crc kubenswrapper[4774]: I0127 00:27:59.977777 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:27:59 crc kubenswrapper[4774]: I0127 00:27:59.979125 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="docker-build" containerID="cri-o://fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" gracePeriod=30 Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.306565 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_sg-bridge-1-build_5ec0c529-6bbd-41cf-bc74-f8a226f562ee/docker-build/0.log" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.307527 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.408698 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.408842 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.408883 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.408913 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409049 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409174 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409265 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409482 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409542 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409771 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409839 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.409913 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") pod \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\" (UID: \"5ec0c529-6bbd-41cf-bc74-f8a226f562ee\") " Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.410324 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.410675 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.410722 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.411375 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.412501 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.412517 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.412716 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.412854 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.422784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt" (OuterVolumeSpecName: "kube-api-access-2xxrt") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "kube-api-access-2xxrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.422800 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.425315 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.473129 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_5ec0c529-6bbd-41cf-bc74-f8a226f562ee/docker-build/0.log" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.474416 4774 generic.go:334] "Generic (PLEG): container finished" podID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerID="fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" exitCode=1 Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.474467 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerDied","Data":"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11"} Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.474542 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"5ec0c529-6bbd-41cf-bc74-f8a226f562ee","Type":"ContainerDied","Data":"ea252ad661071f08d5d4dbb63efce080e36285b3984e1fe7d27a8a1b1c69df27"} Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.474663 4774 scope.go:117] "RemoveContainer" containerID="fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.475001 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.498434 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.504253 4774 scope.go:117] "RemoveContainer" containerID="f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513297 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513331 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513345 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513358 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513370 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513381 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513392 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xxrt\" (UniqueName: \"kubernetes.io/projected/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-kube-api-access-2xxrt\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513404 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.513566 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.536372 4774 scope.go:117] "RemoveContainer" containerID="fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" Jan 27 00:28:00 crc kubenswrapper[4774]: E0127 00:28:00.537161 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11\": container with ID starting with fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11 not found: ID does not exist" containerID="fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.537238 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11"} err="failed to get container status \"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11\": rpc error: code = NotFound desc = could not find container \"fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11\": container with ID starting with fd6b5e8f7a62d9f47cac43d6da64505dc55bd83275a03bbb2b9bea039d751a11 not found: ID does not exist" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.537283 4774 scope.go:117] "RemoveContainer" containerID="f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f" Jan 27 00:28:00 crc kubenswrapper[4774]: E0127 00:28:00.537806 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f\": container with ID starting with f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f not found: ID does not exist" containerID="f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.537934 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f"} err="failed to get container status \"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f\": rpc error: code = NotFound desc = could not find container \"f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f\": container with ID starting with f01ade6d30c8fd99cdb80e8d3232f9a2c0baa14ee09c308587093514dc70ea5f not found: ID does not exist" Jan 27 00:28:00 crc kubenswrapper[4774]: I0127 00:28:00.986260 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5ec0c529-6bbd-41cf-bc74-f8a226f562ee" (UID: "5ec0c529-6bbd-41cf-bc74-f8a226f562ee"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.021984 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5ec0c529-6bbd-41cf-bc74-f8a226f562ee-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.120977 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.134666 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.672009 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Jan 27 00:28:01 crc kubenswrapper[4774]: E0127 00:28:01.672650 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="manage-dockerfile" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.672737 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="manage-dockerfile" Jan 27 00:28:01 crc kubenswrapper[4774]: E0127 00:28:01.672834 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="docker-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.672982 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="docker-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.673172 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" containerName="docker-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.674350 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.678820 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.679244 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.679344 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.679404 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.711391 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.735899 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.735986 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736065 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736109 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736147 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736207 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzn25\" 
(UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736236 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736267 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736326 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736359 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.736410 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.836908 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837180 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837279 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzn25\" (UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837385 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837459 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837607 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837692 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837767 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837852 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838471 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.837409 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838187 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") pod 
\"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838231 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838086 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838525 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838644 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838707 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.838892 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.839162 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.839161 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.841338 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc 
kubenswrapper[4774]: I0127 00:28:01.842277 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.857324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzn25\" (UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") pod \"sg-bridge-2-build\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:01 crc kubenswrapper[4774]: I0127 00:28:01.995488 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:02 crc kubenswrapper[4774]: I0127 00:28:02.273540 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Jan 27 00:28:02 crc kubenswrapper[4774]: I0127 00:28:02.372154 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec0c529-6bbd-41cf-bc74-f8a226f562ee" path="/var/lib/kubelet/pods/5ec0c529-6bbd-41cf-bc74-f8a226f562ee/volumes" Jan 27 00:28:02 crc kubenswrapper[4774]: I0127 00:28:02.501300 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerStarted","Data":"f4695ee39043ee1cf7334009731d6730fe0de886ad8fee5d4726ff3874944e3a"} Jan 27 00:28:03 crc kubenswrapper[4774]: I0127 00:28:03.509161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerStarted","Data":"e54a1121c6dba1bbac9e3c48e316c05cc45c49c651516fbd46a91ccf1b04c0cd"} Jan 27 00:28:04 crc kubenswrapper[4774]: I0127 00:28:04.519926 4774 generic.go:334] "Generic (PLEG): container finished" podID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerID="e54a1121c6dba1bbac9e3c48e316c05cc45c49c651516fbd46a91ccf1b04c0cd" exitCode=0 Jan 27 00:28:04 crc kubenswrapper[4774]: I0127 00:28:04.520069 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerDied","Data":"e54a1121c6dba1bbac9e3c48e316c05cc45c49c651516fbd46a91ccf1b04c0cd"} Jan 27 00:28:05 crc kubenswrapper[4774]: I0127 00:28:05.531467 4774 generic.go:334] "Generic (PLEG): container finished" podID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerID="9be9db062f94ba7560ebf270e6cdf2ea9fba53b2de3e472a109511b2d455e78e" exitCode=0 Jan 27 00:28:05 crc kubenswrapper[4774]: I0127 00:28:05.531562 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerDied","Data":"9be9db062f94ba7560ebf270e6cdf2ea9fba53b2de3e472a109511b2d455e78e"} Jan 27 00:28:05 crc kubenswrapper[4774]: I0127 00:28:05.572007 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_335eb417-b9c1-42a3-8725-2f18e5dab7a8/manage-dockerfile/0.log" Jan 27 00:28:06 crc kubenswrapper[4774]: I0127 00:28:06.543778 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" 
event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerStarted","Data":"22ca06acc03323344bb1ef113c9898343cd315ad78e6e01547677009e369b46d"} Jan 27 00:28:06 crc kubenswrapper[4774]: I0127 00:28:06.595225 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=5.595194673 podStartE2EDuration="5.595194673s" podCreationTimestamp="2026-01-27 00:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:28:06.585508873 +0000 UTC m=+1264.891285847" watchObservedRunningTime="2026-01-27 00:28:06.595194673 +0000 UTC m=+1264.900971577" Jan 27 00:28:54 crc kubenswrapper[4774]: I0127 00:28:54.957386 4774 generic.go:334] "Generic (PLEG): container finished" podID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerID="22ca06acc03323344bb1ef113c9898343cd315ad78e6e01547677009e369b46d" exitCode=0 Jan 27 00:28:54 crc kubenswrapper[4774]: I0127 00:28:54.957488 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerDied","Data":"22ca06acc03323344bb1ef113c9898343cd315ad78e6e01547677009e369b46d"} Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.351562 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.452655 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453198 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453276 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453309 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453390 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453424 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453450 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453475 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453531 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453542 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453576 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453664 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzn25\" (UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.453713 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") pod \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\" (UID: \"335eb417-b9c1-42a3-8725-2f18e5dab7a8\") " Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.454057 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.454187 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.454776 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.454847 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.455539 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.455750 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.455869 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.461006 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.461023 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25" (OuterVolumeSpecName: "kube-api-access-nzn25") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "kube-api-access-nzn25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.461065 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555893 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555931 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555943 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555953 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555962 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/335eb417-b9c1-42a3-8725-2f18e5dab7a8-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555971 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzn25\" (UniqueName: \"kubernetes.io/projected/335eb417-b9c1-42a3-8725-2f18e5dab7a8-kube-api-access-nzn25\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555980 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.555989 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.556000 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.592175 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.658154 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.978741 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"335eb417-b9c1-42a3-8725-2f18e5dab7a8","Type":"ContainerDied","Data":"f4695ee39043ee1cf7334009731d6730fe0de886ad8fee5d4726ff3874944e3a"} Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.978788 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4695ee39043ee1cf7334009731d6730fe0de886ad8fee5d4726ff3874944e3a" Jan 27 00:28:56 crc kubenswrapper[4774]: I0127 00:28:56.978918 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Jan 27 00:28:57 crc kubenswrapper[4774]: I0127 00:28:57.345921 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "335eb417-b9c1-42a3-8725-2f18e5dab7a8" (UID: "335eb417-b9c1-42a3-8725-2f18e5dab7a8"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:28:57 crc kubenswrapper[4774]: I0127 00:28:57.370069 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/335eb417-b9c1-42a3-8725-2f18e5dab7a8-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.932854 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:00 crc kubenswrapper[4774]: E0127 00:29:00.933652 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="manage-dockerfile" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.933670 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="manage-dockerfile" Jan 27 00:29:00 crc kubenswrapper[4774]: E0127 00:29:00.933694 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="docker-build" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.933703 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="docker-build" Jan 27 00:29:00 crc kubenswrapper[4774]: E0127 00:29:00.933725 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="git-clone" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.933735 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="git-clone" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.933915 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="335eb417-b9c1-42a3-8725-2f18e5dab7a8" containerName="docker-build" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.935054 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.937699 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.937910 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.937924 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.938168 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Jan 27 00:29:00 crc kubenswrapper[4774]: I0127 00:29:00.958112 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083406 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083460 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083506 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083754 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.083970 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084014 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084058 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084097 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084150 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084264 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.084371 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185510 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185588 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185626 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185670 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185702 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185741 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185779 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185936 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186314 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.185795 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186526 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") 
" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186695 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186795 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.186907 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187040 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187241 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187427 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187553 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.187536 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.188015 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.188181 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.193537 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.201270 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.207412 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.256261 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:01 crc kubenswrapper[4774]: I0127 00:29:01.529952 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:02 crc kubenswrapper[4774]: I0127 00:29:02.025442 4774 generic.go:334] "Generic (PLEG): container finished" podID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerID="4f2d7297d4d28dd848f59b341d688f7920adc8f241b7e1da6ad4079f86054ae7" exitCode=0 Jan 27 00:29:02 crc kubenswrapper[4774]: I0127 00:29:02.025535 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerDied","Data":"4f2d7297d4d28dd848f59b341d688f7920adc8f241b7e1da6ad4079f86054ae7"} Jan 27 00:29:02 crc kubenswrapper[4774]: I0127 00:29:02.026000 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerStarted","Data":"6400856724c89fbcf80eb83da2af95703f59cc0a5ae533256599b067df3217f6"} Jan 27 00:29:03 crc kubenswrapper[4774]: I0127 00:29:03.037552 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerStarted","Data":"1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955"} Jan 27 00:29:03 crc kubenswrapper[4774]: I0127 00:29:03.068495 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.068450684 podStartE2EDuration="3.068450684s" podCreationTimestamp="2026-01-27 00:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:29:03.062208786 +0000 UTC m=+1321.367985660" watchObservedRunningTime="2026-01-27 00:29:03.068450684 +0000 UTC m=+1321.374227578" Jan 27 00:29:11 crc kubenswrapper[4774]: I0127 00:29:11.724354 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:11 crc kubenswrapper[4774]: I0127 00:29:11.726404 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="docker-build" containerID="cri-o://1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955" gracePeriod=30 Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.127561 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e6b5f484-b9d4-4133-8d1f-af31e71ea317/docker-build/0.log" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129110 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e6b5f484-b9d4-4133-8d1f-af31e71ea317/docker-build/0.log" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129143 4774 generic.go:334] "Generic (PLEG): container finished" podID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerID="1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955" exitCode=1 Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129186 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" 
event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerDied","Data":"1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955"} Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129234 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e6b5f484-b9d4-4133-8d1f-af31e71ea317","Type":"ContainerDied","Data":"6400856724c89fbcf80eb83da2af95703f59cc0a5ae533256599b067df3217f6"} Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.129249 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6400856724c89fbcf80eb83da2af95703f59cc0a5ae533256599b067df3217f6" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.130479 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269628 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269741 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269796 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269854 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269914 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.269963 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270016 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270039 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270152 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270181 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270203 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270225 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") pod \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\" (UID: \"e6b5f484-b9d4-4133-8d1f-af31e71ea317\") " Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270530 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.270617 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.271170 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.272046 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.272345 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.272353 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.273480 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.282542 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.283196 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.285294 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w" (OuterVolumeSpecName: "kube-api-access-4ms8w") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "kube-api-access-4ms8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.351770 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371499 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371543 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371563 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371580 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371600 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371617 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371636 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/e6b5f484-b9d4-4133-8d1f-af31e71ea317-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371654 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6b5f484-b9d4-4133-8d1f-af31e71ea317-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371672 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371690 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.371706 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ms8w\" (UniqueName: \"kubernetes.io/projected/e6b5f484-b9d4-4133-8d1f-af31e71ea317-kube-api-access-4ms8w\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.686929 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e6b5f484-b9d4-4133-8d1f-af31e71ea317" (UID: "e6b5f484-b9d4-4133-8d1f-af31e71ea317"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:29:12 crc kubenswrapper[4774]: I0127 00:29:12.776495 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e6b5f484-b9d4-4133-8d1f-af31e71ea317-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.140625 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.209202 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.216479 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.298479 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Jan 27 00:29:13 crc kubenswrapper[4774]: E0127 00:29:13.298901 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="manage-dockerfile" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.298929 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="manage-dockerfile" Jan 27 00:29:13 crc kubenswrapper[4774]: E0127 00:29:13.298949 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="docker-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.298960 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="docker-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.299184 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" containerName="docker-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.300412 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.303124 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.303444 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.303671 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.306458 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.326166 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.487989 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488043 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488074 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488093 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm86t\" (UniqueName: \"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488197 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488218 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488235 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488269 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488289 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488346 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488372 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.488392 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590156 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590226 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 
00:29:13.590253 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm86t\" (UniqueName: \"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590347 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590372 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590391 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590411 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590438 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590486 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590517 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590543 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.590566 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591296 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591401 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591500 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591511 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591538 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591721 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591839 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.591905 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.592600 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.597716 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.597715 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.622261 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm86t\" (UniqueName: \"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:13 crc kubenswrapper[4774]: I0127 00:29:13.921531 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:29:14 crc kubenswrapper[4774]: I0127 00:29:14.185660 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Jan 27 00:29:14 crc kubenswrapper[4774]: I0127 00:29:14.374560 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b5f484-b9d4-4133-8d1f-af31e71ea317" path="/var/lib/kubelet/pods/e6b5f484-b9d4-4133-8d1f-af31e71ea317/volumes" Jan 27 00:29:15 crc kubenswrapper[4774]: I0127 00:29:15.157761 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerStarted","Data":"49dbef974c4cda05528d3933f9e63b2acb525fc6f9b443e92b13b25b2daad9a3"} Jan 27 00:29:15 crc kubenswrapper[4774]: I0127 00:29:15.158301 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerStarted","Data":"8cd691b05592e7b3c264b7fbed06d48d81cd99fb31ca0c955cfed3fd066caee3"} Jan 27 00:29:16 crc kubenswrapper[4774]: I0127 00:29:16.165803 4774 generic.go:334] "Generic (PLEG): container finished" podID="50bd722f-c392-4f78-99af-2009d9b35de2" containerID="49dbef974c4cda05528d3933f9e63b2acb525fc6f9b443e92b13b25b2daad9a3" exitCode=0 Jan 27 00:29:16 crc kubenswrapper[4774]: I0127 00:29:16.165878 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerDied","Data":"49dbef974c4cda05528d3933f9e63b2acb525fc6f9b443e92b13b25b2daad9a3"} Jan 27 00:29:17 crc kubenswrapper[4774]: I0127 00:29:17.177089 4774 generic.go:334] "Generic (PLEG): container finished" podID="50bd722f-c392-4f78-99af-2009d9b35de2" containerID="70ff28ac8a42b280187899cd540772bb0368aa9b3ed60593bb29316ad8ccb06a" exitCode=0 Jan 27 00:29:17 crc kubenswrapper[4774]: I0127 00:29:17.177216 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerDied","Data":"70ff28ac8a42b280187899cd540772bb0368aa9b3ed60593bb29316ad8ccb06a"} Jan 27 00:29:17 crc kubenswrapper[4774]: I0127 00:29:17.230677 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_50bd722f-c392-4f78-99af-2009d9b35de2/manage-dockerfile/0.log" Jan 27 00:29:18 crc kubenswrapper[4774]: I0127 00:29:18.194384 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerStarted","Data":"6a4457f6719b8c78a9bdd9efd2a65239dabc2a80fae44a0533533901adf4c519"} Jan 27 00:29:18 crc kubenswrapper[4774]: I0127 00:29:18.241415 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.241201682 podStartE2EDuration="5.241201682s" podCreationTimestamp="2026-01-27 00:29:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:29:18.235738025 +0000 UTC m=+1336.541514919" watchObservedRunningTime="2026-01-27 00:29:18.241201682 +0000 UTC m=+1336.546978566" Jan 27 00:29:36 crc kubenswrapper[4774]: I0127 00:29:36.675087 4774 patch_prober.go:28] 
interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:29:36 crc kubenswrapper[4774]: I0127 00:29:36.675766 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.736764 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.739583 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.753690 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.902938 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.903025 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:51 crc kubenswrapper[4774]: I0127 00:29:51.903792 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.004731 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.004784 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.004827 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " 
pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.005253 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.005415 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.031323 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") pod \"redhat-operators-672r9\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.057807 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.307840 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:29:52 crc kubenswrapper[4774]: I0127 00:29:52.481605 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerStarted","Data":"bc478ee9fcc61b8b2a27262557b35effaf576b0e163905e01aa0bfa4d5e05938"} Jan 27 00:29:53 crc kubenswrapper[4774]: I0127 00:29:53.488853 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerID="39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e" exitCode=0 Jan 27 00:29:53 crc kubenswrapper[4774]: I0127 00:29:53.488971 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerDied","Data":"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e"} Jan 27 00:29:53 crc kubenswrapper[4774]: I0127 00:29:53.491064 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:29:55 crc kubenswrapper[4774]: I0127 00:29:55.505259 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerID="8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090" exitCode=0 Jan 27 00:29:55 crc kubenswrapper[4774]: I0127 00:29:55.505391 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerDied","Data":"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090"} Jan 27 00:29:56 crc kubenswrapper[4774]: I0127 00:29:56.513253 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerStarted","Data":"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0"} Jan 27 00:29:56 crc kubenswrapper[4774]: 
I0127 00:29:56.534682 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-672r9" podStartSLOduration=3.092441609 podStartE2EDuration="5.53466119s" podCreationTimestamp="2026-01-27 00:29:51 +0000 UTC" firstStartedPulling="2026-01-27 00:29:53.490782947 +0000 UTC m=+1371.796559831" lastFinishedPulling="2026-01-27 00:29:55.933002528 +0000 UTC m=+1374.238779412" observedRunningTime="2026-01-27 00:29:56.531952627 +0000 UTC m=+1374.837729541" watchObservedRunningTime="2026-01-27 00:29:56.53466119 +0000 UTC m=+1374.840438094" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.168435 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v"] Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.169835 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.176092 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.176713 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.201889 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v"] Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.321642 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.321733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.321971 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.423337 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.423413 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.423947 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.424617 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.431547 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.451409 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") pod \"collect-profiles-29491230-tjg9v\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.511208 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:00 crc kubenswrapper[4774]: I0127 00:30:00.725431 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v"] Jan 27 00:30:00 crc kubenswrapper[4774]: W0127 00:30:00.729149 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd04202_19c2_4d82_9a03_ddd3ddbc35d0.slice/crio-14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3 WatchSource:0}: Error finding container 14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3: Status 404 returned error can't find the container with id 14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3 Jan 27 00:30:01 crc kubenswrapper[4774]: I0127 00:30:01.555985 4774 generic.go:334] "Generic (PLEG): container finished" podID="bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" containerID="950a86b020502de0870835a037b1b740c89163f25bd9f03d78753bef170c3ab0" exitCode=0 Jan 27 00:30:01 crc kubenswrapper[4774]: I0127 00:30:01.556404 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" event={"ID":"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0","Type":"ContainerDied","Data":"950a86b020502de0870835a037b1b740c89163f25bd9f03d78753bef170c3ab0"} Jan 27 00:30:01 crc kubenswrapper[4774]: I0127 00:30:01.556438 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" event={"ID":"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0","Type":"ContainerStarted","Data":"14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3"} Jan 27 00:30:01 crc kubenswrapper[4774]: E0127 00:30:01.580636 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd04202_19c2_4d82_9a03_ddd3ddbc35d0.slice/crio-conmon-950a86b020502de0870835a037b1b740c89163f25bd9f03d78753bef170c3ab0.scope\": RecentStats: unable to find data in memory cache]" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.058823 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.058926 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.827461 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.964158 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") pod \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.964328 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") pod \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.964393 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") pod \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\" (UID: \"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0\") " Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.965382 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" (UID: "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.977696 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" (UID: "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:02 crc kubenswrapper[4774]: I0127 00:30:02.978599 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb" (OuterVolumeSpecName: "kube-api-access-p44fb") pod "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" (UID: "bfd04202-19c2-4d82-9a03-ddd3ddbc35d0"). InnerVolumeSpecName "kube-api-access-p44fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.066973 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.067033 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.067048 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p44fb\" (UniqueName: \"kubernetes.io/projected/bfd04202-19c2-4d82-9a03-ddd3ddbc35d0-kube-api-access-p44fb\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.109233 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-672r9" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" probeResult="failure" output=< Jan 27 00:30:03 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:30:03 crc kubenswrapper[4774]: > Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.582505 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" event={"ID":"bfd04202-19c2-4d82-9a03-ddd3ddbc35d0","Type":"ContainerDied","Data":"14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3"} Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.583004 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14ea47f815b25e1376017627be8dda2b43a8f0a4d624ad7bc7580eb453d067b3" Jan 27 00:30:03 crc kubenswrapper[4774]: I0127 00:30:03.582561 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491230-tjg9v" Jan 27 00:30:06 crc kubenswrapper[4774]: I0127 00:30:06.675883 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:30:06 crc kubenswrapper[4774]: I0127 00:30:06.676377 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:30:08 crc kubenswrapper[4774]: I0127 00:30:08.631620 4774 generic.go:334] "Generic (PLEG): container finished" podID="50bd722f-c392-4f78-99af-2009d9b35de2" containerID="6a4457f6719b8c78a9bdd9efd2a65239dabc2a80fae44a0533533901adf4c519" exitCode=0 Jan 27 00:30:08 crc kubenswrapper[4774]: I0127 00:30:08.631689 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerDied","Data":"6a4457f6719b8c78a9bdd9efd2a65239dabc2a80fae44a0533533901adf4c519"} Jan 27 00:30:09 crc kubenswrapper[4774]: I0127 00:30:09.958733 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077699 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077771 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077810 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077852 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077897 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077918 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.077988 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078025 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078061 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078093 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm86t\" (UniqueName: 
\"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078113 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078142 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") pod \"50bd722f-c392-4f78-99af-2009d9b35de2\" (UID: \"50bd722f-c392-4f78-99af-2009d9b35de2\") " Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078196 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078700 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.078709 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.079203 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.080013 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.080060 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). 
InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.080143 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086363 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086392 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086406 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086417 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086429 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/50bd722f-c392-4f78-99af-2009d9b35de2-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086442 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.086453 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50bd722f-c392-4f78-99af-2009d9b35de2-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.087103 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t" (OuterVolumeSpecName: "kube-api-access-jm86t") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "kube-api-access-jm86t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.087286 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.088428 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.187948 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.187982 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/50bd722f-c392-4f78-99af-2009d9b35de2-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.187993 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm86t\" (UniqueName: \"kubernetes.io/projected/50bd722f-c392-4f78-99af-2009d9b35de2-kube-api-access-jm86t\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.660492 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"50bd722f-c392-4f78-99af-2009d9b35de2","Type":"ContainerDied","Data":"8cd691b05592e7b3c264b7fbed06d48d81cd99fb31ca0c955cfed3fd066caee3"} Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.660925 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cd691b05592e7b3c264b7fbed06d48d81cd99fb31ca0c955cfed3fd066caee3" Jan 27 00:30:10 crc kubenswrapper[4774]: I0127 00:30:10.660615 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Jan 27 00:30:11 crc kubenswrapper[4774]: I0127 00:30:11.541897 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:11 crc kubenswrapper[4774]: I0127 00:30:11.610785 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:11 crc kubenswrapper[4774]: I0127 00:30:11.876529 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "50bd722f-c392-4f78-99af-2009d9b35de2" (UID: "50bd722f-c392-4f78-99af-2009d9b35de2"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:11 crc kubenswrapper[4774]: I0127 00:30:11.917913 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/50bd722f-c392-4f78-99af-2009d9b35de2-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:12 crc kubenswrapper[4774]: I0127 00:30:12.100096 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:12 crc kubenswrapper[4774]: I0127 00:30:12.150905 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:12 crc kubenswrapper[4774]: I0127 00:30:12.341368 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:30:13 crc kubenswrapper[4774]: I0127 00:30:13.699125 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-672r9" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" containerID="cri-o://e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" gracePeriod=2 Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.178020 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.358150 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") pod \"9e60fce2-faf3-4f7c-a798-668f9658a557\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.358625 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") pod \"9e60fce2-faf3-4f7c-a798-668f9658a557\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.358649 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") pod \"9e60fce2-faf3-4f7c-a798-668f9658a557\" (UID: \"9e60fce2-faf3-4f7c-a798-668f9658a557\") " Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.359491 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities" (OuterVolumeSpecName: "utilities") pod "9e60fce2-faf3-4f7c-a798-668f9658a557" (UID: "9e60fce2-faf3-4f7c-a798-668f9658a557"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.362641 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.366056 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd" (OuterVolumeSpecName: "kube-api-access-bbkqd") pod "9e60fce2-faf3-4f7c-a798-668f9658a557" (UID: "9e60fce2-faf3-4f7c-a798-668f9658a557"). InnerVolumeSpecName "kube-api-access-bbkqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.464196 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbkqd\" (UniqueName: \"kubernetes.io/projected/9e60fce2-faf3-4f7c-a798-668f9658a557-kube-api-access-bbkqd\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.494724 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e60fce2-faf3-4f7c-a798-668f9658a557" (UID: "9e60fce2-faf3-4f7c-a798-668f9658a557"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.565414 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e60fce2-faf3-4f7c-a798-668f9658a557-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708128 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerID="e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" exitCode=0 Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708191 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-672r9" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708189 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerDied","Data":"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0"} Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708284 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672r9" event={"ID":"9e60fce2-faf3-4f7c-a798-668f9658a557","Type":"ContainerDied","Data":"bc478ee9fcc61b8b2a27262557b35effaf576b0e163905e01aa0bfa4d5e05938"} Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.708321 4774 scope.go:117] "RemoveContainer" containerID="e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.728175 4774 scope.go:117] "RemoveContainer" containerID="8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.745574 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.748972 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-672r9"] Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.763144 4774 scope.go:117] "RemoveContainer" containerID="39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.782961 4774 scope.go:117] "RemoveContainer" containerID="e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" Jan 27 00:30:14 crc kubenswrapper[4774]: E0127 00:30:14.783781 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0\": container with ID starting with e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0 not found: ID does not exist" containerID="e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.783831 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0"} err="failed to get container status \"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0\": rpc error: code = NotFound desc = could not find container \"e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0\": container with ID starting with e915b49b5906a0fe4efd612f7ff9368f1d41eecc1cfbed97b6441e4be1d512f0 not found: ID does not exist" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.783890 4774 scope.go:117] "RemoveContainer" containerID="8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090" Jan 27 00:30:14 crc kubenswrapper[4774]: E0127 00:30:14.784560 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090\": container with ID starting with 8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090 not found: ID does not exist" containerID="8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.784620 4774 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090"} err="failed to get container status \"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090\": rpc error: code = NotFound desc = could not find container \"8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090\": container with ID starting with 8628b0f3e4073e83072f88fd5f72eb5f062cb79bb014400daf54b5505e314090 not found: ID does not exist" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.784659 4774 scope.go:117] "RemoveContainer" containerID="39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e" Jan 27 00:30:14 crc kubenswrapper[4774]: E0127 00:30:14.785247 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e\": container with ID starting with 39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e not found: ID does not exist" containerID="39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e" Jan 27 00:30:14 crc kubenswrapper[4774]: I0127 00:30:14.785268 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e"} err="failed to get container status \"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e\": rpc error: code = NotFound desc = could not find container \"39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e\": container with ID starting with 39a19f3d8685d0ec9c34bb73a3f02aa74dcb21423d3131842c498705b45b458e not found: ID does not exist" Jan 27 00:30:16 crc kubenswrapper[4774]: I0127 00:30:16.366507 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" path="/var/lib/kubelet/pods/9e60fce2-faf3-4f7c-a798-668f9658a557/volumes" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.152439 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154338 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="extract-content" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154365 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="extract-content" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154381 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" containerName="collect-profiles" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154389 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" containerName="collect-profiles" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154401 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="git-clone" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154410 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="git-clone" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154421 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" 
containerName="extract-utilities" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154429 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="extract-utilities" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154440 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154448 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154464 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="manage-dockerfile" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154472 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="manage-dockerfile" Jan 27 00:30:20 crc kubenswrapper[4774]: E0127 00:30:20.154490 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="docker-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154497 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="docker-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154639 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e60fce2-faf3-4f7c-a798-668f9658a557" containerName="registry-server" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154652 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd04202-19c2-4d82-9a03-ddd3ddbc35d0" containerName="collect-profiles" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.154670 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bd722f-c392-4f78-99af-2009d9b35de2" containerName="docker-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.155565 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.157797 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.158120 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.158714 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.162013 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.172118 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345331 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345386 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345461 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345569 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345614 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345641 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345689 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345732 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345779 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345804 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.345827 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.447778 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.447878 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.447929 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.447973 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448007 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448067 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448128 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448200 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448234 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448267 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448366 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.448553 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.449164 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.449317 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.449366 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.449832 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.450198 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.450669 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.450754 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.450790 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.456032 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.456706 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.469635 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.515039 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.949496 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.995926 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:20 crc kubenswrapper[4774]: I0127 00:30:20.997202 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.014744 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.056730 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.057024 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.057367 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.158546 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.158624 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.158662 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.159181 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.159276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.185019 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") pod \"community-operators-4hn7v\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.331409 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.643731 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.770004 4774 generic.go:334] "Generic (PLEG): container finished" podID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerID="7bdfb69e9fa9696e37a7d9b25d175b63e5343122a1e569faca1c08955f8bbb1d" exitCode=0 Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.770105 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"69f0575b-205f-45e8-9dc2-6dc9a58677b3","Type":"ContainerDied","Data":"7bdfb69e9fa9696e37a7d9b25d175b63e5343122a1e569faca1c08955f8bbb1d"} Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.770166 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"69f0575b-205f-45e8-9dc2-6dc9a58677b3","Type":"ContainerStarted","Data":"d81d42440ab00778c86f6fcb4c62ec165af109b644786779b55c38f5d05a2e58"} Jan 27 00:30:21 crc kubenswrapper[4774]: I0127 00:30:21.773061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerStarted","Data":"1d14e5556a15340dd3b0becb619b1db4a8136052b768e1c956c0feb6736d4ddb"} Jan 27 00:30:22 crc kubenswrapper[4774]: E0127 00:30:22.045831 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f2f66a_e5f3_4fa4_922f_048d076eb189.slice/crio-3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023.scope\": RecentStats: unable to find data in memory cache]" Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.784237 4774 generic.go:334] "Generic (PLEG): container finished" podID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerID="3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023" exitCode=0 Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.784369 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerDied","Data":"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023"} Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.787304 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_69f0575b-205f-45e8-9dc2-6dc9a58677b3/docker-build/0.log" Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.787781 4774 generic.go:334] "Generic (PLEG): container finished" podID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerID="89911886b25b8c950e2ce162b903638384fd14c3a7c90ec3cfa3cf3f5c1e0950" exitCode=1 Jan 27 00:30:22 crc kubenswrapper[4774]: I0127 00:30:22.787832 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"69f0575b-205f-45e8-9dc2-6dc9a58677b3","Type":"ContainerDied","Data":"89911886b25b8c950e2ce162b903638384fd14c3a7c90ec3cfa3cf3f5c1e0950"} Jan 27 00:30:23 crc kubenswrapper[4774]: I0127 00:30:23.797934 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerStarted","Data":"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0"} Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.158303 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_69f0575b-205f-45e8-9dc2-6dc9a58677b3/docker-build/0.log" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.159060 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.215618 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.215946 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216058 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216139 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216214 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216285 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216451 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216642 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216543 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216767 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216687 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216889 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216930 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.216955 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") pod \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\" (UID: \"69f0575b-205f-45e8-9dc2-6dc9a58677b3\") " Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217092 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217099 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217160 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217473 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217619 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217682 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217737 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217799 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217872 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217942 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/69f0575b-205f-45e8-9dc2-6dc9a58677b3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.218004 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.217727 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.218470 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.221099 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.221452 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.221550 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8" (OuterVolumeSpecName: "kube-api-access-fptm8") pod "69f0575b-205f-45e8-9dc2-6dc9a58677b3" (UID: "69f0575b-205f-45e8-9dc2-6dc9a58677b3"). InnerVolumeSpecName "kube-api-access-fptm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320706 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320774 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fptm8\" (UniqueName: \"kubernetes.io/projected/69f0575b-205f-45e8-9dc2-6dc9a58677b3-kube-api-access-fptm8\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320795 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320814 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/69f0575b-205f-45e8-9dc2-6dc9a58677b3-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.320834 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/69f0575b-205f-45e8-9dc2-6dc9a58677b3-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.805157 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_69f0575b-205f-45e8-9dc2-6dc9a58677b3/docker-build/0.log" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.805561 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.805517 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"69f0575b-205f-45e8-9dc2-6dc9a58677b3","Type":"ContainerDied","Data":"d81d42440ab00778c86f6fcb4c62ec165af109b644786779b55c38f5d05a2e58"} Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.805894 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d81d42440ab00778c86f6fcb4c62ec165af109b644786779b55c38f5d05a2e58" Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.808472 4774 generic.go:334] "Generic (PLEG): container finished" podID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerID="88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0" exitCode=0 Jan 27 00:30:24 crc kubenswrapper[4774]: I0127 00:30:24.808504 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerDied","Data":"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0"} Jan 27 00:30:25 crc kubenswrapper[4774]: I0127 00:30:25.817770 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerStarted","Data":"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d"} Jan 27 00:30:25 crc kubenswrapper[4774]: I0127 00:30:25.839002 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4hn7v" podStartSLOduration=3.282519349 podStartE2EDuration="5.838974781s" podCreationTimestamp="2026-01-27 00:30:20 +0000 UTC" firstStartedPulling="2026-01-27 00:30:22.786445697 +0000 UTC m=+1401.092222601" lastFinishedPulling="2026-01-27 00:30:25.342901149 +0000 UTC m=+1403.648678033" observedRunningTime="2026-01-27 00:30:25.833105274 +0000 UTC m=+1404.138882188" watchObservedRunningTime="2026-01-27 00:30:25.838974781 +0000 UTC m=+1404.144751665" Jan 27 00:30:30 crc kubenswrapper[4774]: I0127 00:30:30.649943 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:30 crc kubenswrapper[4774]: I0127 00:30:30.657536 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.331905 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.331992 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.388791 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.914568 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:31 crc kubenswrapper[4774]: I0127 00:30:31.965015 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.330003 4774 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Jan 27 00:30:32 crc kubenswrapper[4774]: E0127 00:30:32.330399 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="manage-dockerfile" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.330430 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="manage-dockerfile" Jan 27 00:30:32 crc kubenswrapper[4774]: E0127 00:30:32.330478 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="docker-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.330493 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="docker-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.330692 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" containerName="docker-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.335673 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.337838 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338247 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338277 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338296 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338396 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338389 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338492 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") pod 
\"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338638 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338758 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338802 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338834 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338877 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.338926 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.339019 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.340034 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.356364 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.366497 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f0575b-205f-45e8-9dc2-6dc9a58677b3" path="/var/lib/kubelet/pods/69f0575b-205f-45e8-9dc2-6dc9a58677b3/volumes" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439724 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439771 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439793 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439850 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439879 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439898 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440154 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.439915 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440285 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440416 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440528 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.440596 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.441749 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" 
(UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.441879 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.441952 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.442045 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.442271 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.442289 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.442463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.446662 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.446754 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") pod 
\"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.458541 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.655277 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:32 crc kubenswrapper[4774]: I0127 00:30:32.898162 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Jan 27 00:30:33 crc kubenswrapper[4774]: I0127 00:30:33.878077 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerStarted","Data":"805fe66c3ab78445d712fcea5389d82d0f6bf329d62105c8d671658dfa8da638"} Jan 27 00:30:33 crc kubenswrapper[4774]: I0127 00:30:33.878415 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerStarted","Data":"2d2f2297cd3f2b3bd4e7f55c45acb60684ffc1e32ed0278ab3f333601623b81f"} Jan 27 00:30:33 crc kubenswrapper[4774]: I0127 00:30:33.878269 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4hn7v" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="registry-server" containerID="cri-o://fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" gracePeriod=2 Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.249065 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.267184 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") pod \"01f2f66a-e5f3-4fa4-922f-048d076eb189\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.267289 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") pod \"01f2f66a-e5f3-4fa4-922f-048d076eb189\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.267440 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") pod \"01f2f66a-e5f3-4fa4-922f-048d076eb189\" (UID: \"01f2f66a-e5f3-4fa4-922f-048d076eb189\") " Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.268058 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities" (OuterVolumeSpecName: "utilities") pod "01f2f66a-e5f3-4fa4-922f-048d076eb189" (UID: "01f2f66a-e5f3-4fa4-922f-048d076eb189"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.274559 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp" (OuterVolumeSpecName: "kube-api-access-6tvvp") pod "01f2f66a-e5f3-4fa4-922f-048d076eb189" (UID: "01f2f66a-e5f3-4fa4-922f-048d076eb189"). InnerVolumeSpecName "kube-api-access-6tvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.292454 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tvvp\" (UniqueName: \"kubernetes.io/projected/01f2f66a-e5f3-4fa4-922f-048d076eb189-kube-api-access-6tvvp\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.292508 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.344293 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01f2f66a-e5f3-4fa4-922f-048d076eb189" (UID: "01f2f66a-e5f3-4fa4-922f-048d076eb189"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.394308 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01f2f66a-e5f3-4fa4-922f-048d076eb189-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886406 4774 generic.go:334] "Generic (PLEG): container finished" podID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerID="fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" exitCode=0 Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886489 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerDied","Data":"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d"} Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886517 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4hn7v" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886542 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4hn7v" event={"ID":"01f2f66a-e5f3-4fa4-922f-048d076eb189","Type":"ContainerDied","Data":"1d14e5556a15340dd3b0becb619b1db4a8136052b768e1c956c0feb6736d4ddb"} Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.886578 4774 scope.go:117] "RemoveContainer" containerID="fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.888114 4774 generic.go:334] "Generic (PLEG): container finished" podID="a9452159-5284-43d8-a321-ddb59019b55d" containerID="805fe66c3ab78445d712fcea5389d82d0f6bf329d62105c8d671658dfa8da638" exitCode=0 Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.888177 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerDied","Data":"805fe66c3ab78445d712fcea5389d82d0f6bf329d62105c8d671658dfa8da638"} Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.932456 4774 scope.go:117] "RemoveContainer" containerID="88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.978026 4774 scope.go:117] "RemoveContainer" containerID="3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023" Jan 27 00:30:34 crc kubenswrapper[4774]: I0127 00:30:34.992362 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.000845 4774 scope.go:117] "RemoveContainer" containerID="fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" Jan 27 00:30:35 crc kubenswrapper[4774]: E0127 00:30:35.001516 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d\": container with ID starting with fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d not found: ID does not exist" containerID="fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.001681 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d"} err="failed to get container status \"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d\": rpc error: code = NotFound desc = could not find container \"fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d\": container with ID starting with fb0fc3b1defc289c731a731d4fb1065a5b6006e019eb5ab6d16733d59ea3df1d not found: ID does not exist" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.001769 4774 scope.go:117] "RemoveContainer" containerID="88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0" Jan 27 00:30:35 crc kubenswrapper[4774]: E0127 00:30:35.002364 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0\": container with ID starting with 88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0 not found: ID does not exist" containerID="88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.002426 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0"} err="failed to get container status \"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0\": rpc error: code = NotFound desc = could not find container \"88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0\": container with ID starting with 88d9b7e9096e123047a930540a22638c3ac03b605de00077f935f6f5cd3eb7a0 not found: ID does not exist" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.002449 4774 scope.go:117] "RemoveContainer" containerID="3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023" Jan 27 00:30:35 crc kubenswrapper[4774]: E0127 00:30:35.002936 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023\": container with ID starting with 3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023 not found: ID does not exist" containerID="3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.003016 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023"} err="failed to get container status \"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023\": rpc error: code = NotFound desc = could not find container \"3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023\": container with ID starting with 3f4a266e560f240161e40201c022889d44b06412da3690889763029aa5a43023 not found: ID does not exist" Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.010890 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4hn7v"] Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.898919 4774 generic.go:334] "Generic (PLEG): container finished" podID="a9452159-5284-43d8-a321-ddb59019b55d" containerID="7ac4abecb15fc0a8b695048482910c50ba464b35c70e3a489852335144587a88" exitCode=0 Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.899112 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerDied","Data":"7ac4abecb15fc0a8b695048482910c50ba464b35c70e3a489852335144587a88"} Jan 27 00:30:35 crc kubenswrapper[4774]: I0127 00:30:35.944554 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_a9452159-5284-43d8-a321-ddb59019b55d/manage-dockerfile/0.log" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.364500 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" path="/var/lib/kubelet/pods/01f2f66a-e5f3-4fa4-922f-048d076eb189/volumes" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.676059 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.676470 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.676521 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.677287 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.677350 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5" gracePeriod=600 Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.913010 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5" exitCode=0 Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.913089 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5"} Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.913187 4774 scope.go:117] "RemoveContainer" containerID="7685c19213d51fa9221db2865a3a407e305a658c67b584f4201953f8284c60cb" Jan 27 00:30:36 crc kubenswrapper[4774]: I0127 00:30:36.917413 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerStarted","Data":"151be50c8b2cb5c05bfac6a97d7a3b29597ca3689c36ab5dc3b127cd32f2dcfd"} Jan 27 00:30:36 crc 
kubenswrapper[4774]: I0127 00:30:36.972362 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.972330468 podStartE2EDuration="4.972330468s" podCreationTimestamp="2026-01-27 00:30:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:30:36.964602061 +0000 UTC m=+1415.270378955" watchObservedRunningTime="2026-01-27 00:30:36.972330468 +0000 UTC m=+1415.278107362" Jan 27 00:30:37 crc kubenswrapper[4774]: I0127 00:30:37.928221 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541"} Jan 27 00:30:39 crc kubenswrapper[4774]: I0127 00:30:39.964132 4774 generic.go:334] "Generic (PLEG): container finished" podID="a9452159-5284-43d8-a321-ddb59019b55d" containerID="151be50c8b2cb5c05bfac6a97d7a3b29597ca3689c36ab5dc3b127cd32f2dcfd" exitCode=0 Jan 27 00:30:39 crc kubenswrapper[4774]: I0127 00:30:39.964229 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerDied","Data":"151be50c8b2cb5c05bfac6a97d7a3b29597ca3689c36ab5dc3b127cd32f2dcfd"} Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.218293 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266120 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266205 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266264 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266354 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266418 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 
00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266464 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266513 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266552 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266584 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266618 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266649 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.266694 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") pod \"a9452159-5284-43d8-a321-ddb59019b55d\" (UID: \"a9452159-5284-43d8-a321-ddb59019b55d\") " Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267221 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267283 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267440 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.267752 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.268275 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.273517 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.274160 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.274340 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.274627 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.275177 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr" (OuterVolumeSpecName: "kube-api-access-nltjr") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "kube-api-access-nltjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.276138 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a9452159-5284-43d8-a321-ddb59019b55d" (UID: "a9452159-5284-43d8-a321-ddb59019b55d"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.368535 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369094 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369166 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369225 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369283 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369335 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369393 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369450 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/a9452159-5284-43d8-a321-ddb59019b55d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369502 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nltjr\" (UniqueName: \"kubernetes.io/projected/a9452159-5284-43d8-a321-ddb59019b55d-kube-api-access-nltjr\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369554 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/a9452159-5284-43d8-a321-ddb59019b55d-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369607 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9452159-5284-43d8-a321-ddb59019b55d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.369705 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a9452159-5284-43d8-a321-ddb59019b55d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.981927 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"a9452159-5284-43d8-a321-ddb59019b55d","Type":"ContainerDied","Data":"2d2f2297cd3f2b3bd4e7f55c45acb60684ffc1e32ed0278ab3f333601623b81f"} Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.981973 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d2f2297cd3f2b3bd4e7f55c45acb60684ffc1e32ed0278ab3f333601623b81f" Jan 27 00:30:41 crc kubenswrapper[4774]: I0127 00:30:41.982008 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.313467 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314081 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="registry-server" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314097 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="registry-server" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314107 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="docker-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314114 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="docker-build" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314127 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="manage-dockerfile" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314135 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="manage-dockerfile" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314147 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="extract-utilities" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314155 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="extract-utilities" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314178 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="extract-content" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314186 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="extract-content" Jan 27 00:30:45 crc kubenswrapper[4774]: E0127 00:30:45.314196 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="git-clone" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314206 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="git-clone" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314328 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9452159-5284-43d8-a321-ddb59019b55d" containerName="docker-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.314349 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f2f66a-e5f3-4fa4-922f-048d076eb189" containerName="registry-server" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.315187 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.320571 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.320571 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.320784 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.320937 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.334876 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436310 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436341 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436369 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436387 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436403 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436427 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436449 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436468 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436486 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436504 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.436535 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.537999 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538110 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538162 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538237 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538288 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538312 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538338 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538365 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 
crc kubenswrapper[4774]: I0127 00:30:45.538401 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.538433 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.539460 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.539680 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.539930 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.540105 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.540524 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.541150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.541203 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.541374 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.541739 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.547703 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.547744 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.564205 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:45 crc kubenswrapper[4774]: I0127 00:30:45.633892 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:46 crc kubenswrapper[4774]: I0127 00:30:46.116338 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:47 crc kubenswrapper[4774]: I0127 00:30:47.042312 4774 generic.go:334] "Generic (PLEG): container finished" podID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerID="14336beb86026bc90d2a9fb6311a3227ae7133bc262a6b935592389475831309" exitCode=0 Jan 27 00:30:47 crc kubenswrapper[4774]: I0127 00:30:47.042432 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"92a02e6d-fee7-4a15-b923-7da2262b3324","Type":"ContainerDied","Data":"14336beb86026bc90d2a9fb6311a3227ae7133bc262a6b935592389475831309"} Jan 27 00:30:47 crc kubenswrapper[4774]: I0127 00:30:47.044392 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"92a02e6d-fee7-4a15-b923-7da2262b3324","Type":"ContainerStarted","Data":"1e33de5c76cbf6c9b219f3d4baa1dfaea96de503848239d1c8ca6f198d8bc8a5"} Jan 27 00:30:48 crc kubenswrapper[4774]: I0127 00:30:48.055104 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_92a02e6d-fee7-4a15-b923-7da2262b3324/docker-build/0.log" Jan 27 00:30:48 crc kubenswrapper[4774]: I0127 00:30:48.057396 4774 generic.go:334] "Generic (PLEG): container finished" podID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerID="49b5f230df2b7f05eeb23873ef8a668173f6d60ab2c8dbad4bbe6685a4aa6248" exitCode=1 Jan 27 00:30:48 crc kubenswrapper[4774]: I0127 00:30:48.057451 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"92a02e6d-fee7-4a15-b923-7da2262b3324","Type":"ContainerDied","Data":"49b5f230df2b7f05eeb23873ef8a668173f6d60ab2c8dbad4bbe6685a4aa6248"} Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.334544 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_92a02e6d-fee7-4a15-b923-7da2262b3324/docker-build/0.log" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.335317 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499350 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499482 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499519 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499586 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499621 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499673 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499742 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499788 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499816 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499843 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499893 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.499931 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") pod \"92a02e6d-fee7-4a15-b923-7da2262b3324\" (UID: \"92a02e6d-fee7-4a15-b923-7da2262b3324\") " Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.501812 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502027 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502385 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502601 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502741 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.502771 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). 
InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.501805 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.503125 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.503332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.504473 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.504557 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt" (OuterVolumeSpecName: "kube-api-access-x4mxt") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "kube-api-access-x4mxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.505179 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "92a02e6d-fee7-4a15-b923-7da2262b3324" (UID: "92a02e6d-fee7-4a15-b923-7da2262b3324"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601443 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601507 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601526 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601549 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601566 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601578 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/92a02e6d-fee7-4a15-b923-7da2262b3324-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601590 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601603 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601614 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4mxt\" (UniqueName: \"kubernetes.io/projected/92a02e6d-fee7-4a15-b923-7da2262b3324-kube-api-access-x4mxt\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601627 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/92a02e6d-fee7-4a15-b923-7da2262b3324-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601639 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92a02e6d-fee7-4a15-b923-7da2262b3324-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:49 crc kubenswrapper[4774]: I0127 00:30:49.601651 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/92a02e6d-fee7-4a15-b923-7da2262b3324-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:30:50 crc kubenswrapper[4774]: I0127 00:30:50.071680 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_92a02e6d-fee7-4a15-b923-7da2262b3324/docker-build/0.log" Jan 27 00:30:50 crc kubenswrapper[4774]: I0127 00:30:50.072158 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"92a02e6d-fee7-4a15-b923-7da2262b3324","Type":"ContainerDied","Data":"1e33de5c76cbf6c9b219f3d4baa1dfaea96de503848239d1c8ca6f198d8bc8a5"} Jan 27 00:30:50 crc kubenswrapper[4774]: I0127 00:30:50.072199 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e33de5c76cbf6c9b219f3d4baa1dfaea96de503848239d1c8ca6f198d8bc8a5" Jan 27 00:30:50 crc kubenswrapper[4774]: I0127 00:30:50.072238 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Jan 27 00:30:55 crc kubenswrapper[4774]: I0127 00:30:55.772157 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:55 crc kubenswrapper[4774]: I0127 00:30:55.781825 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Jan 27 00:30:56 crc kubenswrapper[4774]: I0127 00:30:56.372590 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" path="/var/lib/kubelet/pods/92a02e6d-fee7-4a15-b923-7da2262b3324/volumes" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.486977 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Jan 27 00:30:57 crc kubenswrapper[4774]: E0127 00:30:57.487364 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="manage-dockerfile" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.487386 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="manage-dockerfile" Jan 27 00:30:57 crc kubenswrapper[4774]: E0127 00:30:57.487405 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="docker-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.487416 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="docker-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.487581 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a02e6d-fee7-4a15-b923-7da2262b3324" containerName="docker-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.488960 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.492793 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.492814 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.493384 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.498693 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545393 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545466 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545537 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545600 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545626 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545669 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc 
kubenswrapper[4774]: I0127 00:30:57.545697 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545737 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545826 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545928 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545953 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.545976 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.548428 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.646920 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.646980 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: 
\"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647000 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647031 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647053 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647101 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647132 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647153 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647179 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647202 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: 
I0127 00:30:57.647224 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647448 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647539 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.647594 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.648099 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.648207 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.649018 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.649132 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.649066 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.649317 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.656271 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.656285 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.678851 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:57 crc kubenswrapper[4774]: I0127 00:30:57.810971 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:30:58 crc kubenswrapper[4774]: I0127 00:30:58.258220 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Jan 27 00:30:58 crc kubenswrapper[4774]: W0127 00:30:58.268229 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcff99b3_b2b8_4a05_b19c_1498f32af54f.slice/crio-cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b WatchSource:0}: Error finding container cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b: Status 404 returned error can't find the container with id cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b Jan 27 00:30:59 crc kubenswrapper[4774]: I0127 00:30:59.140746 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerStarted","Data":"2b5d96bc668cc1b8799c7e981e3b41a835d8526d3b481d954264c13d6e5699d3"} Jan 27 00:30:59 crc kubenswrapper[4774]: I0127 00:30:59.141089 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerStarted","Data":"cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b"} Jan 27 00:31:00 crc kubenswrapper[4774]: I0127 00:31:00.153852 4774 generic.go:334] "Generic (PLEG): container finished" podID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerID="2b5d96bc668cc1b8799c7e981e3b41a835d8526d3b481d954264c13d6e5699d3" exitCode=0 Jan 27 00:31:00 crc kubenswrapper[4774]: I0127 00:31:00.153922 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerDied","Data":"2b5d96bc668cc1b8799c7e981e3b41a835d8526d3b481d954264c13d6e5699d3"} Jan 27 00:31:01 crc kubenswrapper[4774]: I0127 00:31:01.167053 4774 generic.go:334] "Generic (PLEG): container finished" podID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerID="45e593b1bf5a453c9b422463098e79f573320f2853ae4259742d4030ee049132" exitCode=0 Jan 27 00:31:01 crc kubenswrapper[4774]: I0127 00:31:01.167138 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerDied","Data":"45e593b1bf5a453c9b422463098e79f573320f2853ae4259742d4030ee049132"} Jan 27 00:31:01 crc kubenswrapper[4774]: I0127 00:31:01.217181 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_fcff99b3-b2b8-4a05-b19c-1498f32af54f/manage-dockerfile/0.log" Jan 27 00:31:02 crc kubenswrapper[4774]: I0127 00:31:02.179218 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerStarted","Data":"32bed463a3ccb3de624f33f4d2d5399fb499d0c44ef103015a08bbc48c8cae78"} Jan 27 00:31:02 crc kubenswrapper[4774]: I0127 00:31:02.216883 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.216827574 podStartE2EDuration="5.216827574s" podCreationTimestamp="2026-01-27 00:30:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:31:02.213398721 +0000 UTC m=+1440.519175645" watchObservedRunningTime="2026-01-27 00:31:02.216827574 +0000 UTC m=+1440.522604488" Jan 27 00:31:05 crc kubenswrapper[4774]: I0127 00:31:05.204674 4774 generic.go:334] "Generic (PLEG): container finished" podID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerID="32bed463a3ccb3de624f33f4d2d5399fb499d0c44ef103015a08bbc48c8cae78" exitCode=0 Jan 27 00:31:05 crc kubenswrapper[4774]: I0127 00:31:05.205664 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerDied","Data":"32bed463a3ccb3de624f33f4d2d5399fb499d0c44ef103015a08bbc48c8cae78"} Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.555317 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690252 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690385 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690422 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690455 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690493 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690545 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690603 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") pod 
\"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690677 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690738 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690774 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690815 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690816 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.690929 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") pod \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\" (UID: \"fcff99b3-b2b8-4a05-b19c-1498f32af54f\") " Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691337 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691388 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691609 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691646 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.691686 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.692372 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.692397 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.693695 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.696837 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.697584 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.698576 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl" (OuterVolumeSpecName: "kube-api-access-r4mrl") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "kube-api-access-r4mrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.701423 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "fcff99b3-b2b8-4a05-b19c-1498f32af54f" (UID: "fcff99b3-b2b8-4a05-b19c-1498f32af54f"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793142 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4mrl\" (UniqueName: \"kubernetes.io/projected/fcff99b3-b2b8-4a05-b19c-1498f32af54f-kube-api-access-r4mrl\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793193 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793212 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793229 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fcff99b3-b2b8-4a05-b19c-1498f32af54f-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793247 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793262 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793278 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793295 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/fcff99b3-b2b8-4a05-b19c-1498f32af54f-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793311 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793329 
4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fcff99b3-b2b8-4a05-b19c-1498f32af54f-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:06 crc kubenswrapper[4774]: I0127 00:31:06.793344 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/fcff99b3-b2b8-4a05-b19c-1498f32af54f-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:07 crc kubenswrapper[4774]: I0127 00:31:07.239346 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"fcff99b3-b2b8-4a05-b19c-1498f32af54f","Type":"ContainerDied","Data":"cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b"} Jan 27 00:31:07 crc kubenswrapper[4774]: I0127 00:31:07.239424 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc68bf99684cb5ff76d5de16ab39bad2d5f93a075e515ad4e59da8fe8111446b" Jan 27 00:31:07 crc kubenswrapper[4774]: I0127 00:31:07.239573 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.115562 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 27 00:31:24 crc kubenswrapper[4774]: E0127 00:31:24.116889 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="manage-dockerfile" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.116911 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="manage-dockerfile" Jan 27 00:31:24 crc kubenswrapper[4774]: E0127 00:31:24.116930 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="git-clone" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.116944 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="git-clone" Jan 27 00:31:24 crc kubenswrapper[4774]: E0127 00:31:24.116964 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="docker-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.116975 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="docker-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.117145 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcff99b3-b2b8-4a05-b19c-1498f32af54f" containerName="docker-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.119019 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.125438 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.125674 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.125847 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.126430 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-8h4m5" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.128281 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.153193 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271254 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271384 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271478 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271512 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271539 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271567 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271610 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271641 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271671 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271797 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271838 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271891 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.271916 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" 
(UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.374987 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375077 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375127 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375163 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375210 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375285 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375322 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375389 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375422 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375448 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375479 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375529 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.375589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.376266 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.376529 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.381464 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" 
Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382126 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382494 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382522 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382574 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.382839 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.383028 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.387785 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.387934 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.388337 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.409166 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") pod \"service-telemetry-framework-index-1-build\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.443725 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:24 crc kubenswrapper[4774]: I0127 00:31:24.786414 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Jan 27 00:31:25 crc kubenswrapper[4774]: I0127 00:31:25.399456 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerStarted","Data":"0aac3241d5cb55fd33c30eb348f2d910363507d410c6e427004e9f23f1c35deb"} Jan 27 00:31:25 crc kubenswrapper[4774]: I0127 00:31:25.399984 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerStarted","Data":"2535846c7c29096f9e9f5c73067f9cff2bd19eb11c98df80dd4d6a08c09a677e"} Jan 27 00:31:26 crc kubenswrapper[4774]: I0127 00:31:26.410440 4774 generic.go:334] "Generic (PLEG): container finished" podID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerID="0aac3241d5cb55fd33c30eb348f2d910363507d410c6e427004e9f23f1c35deb" exitCode=0 Jan 27 00:31:26 crc kubenswrapper[4774]: I0127 00:31:26.410522 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerDied","Data":"0aac3241d5cb55fd33c30eb348f2d910363507d410c6e427004e9f23f1c35deb"} Jan 27 00:31:27 crc kubenswrapper[4774]: I0127 00:31:27.423705 4774 generic.go:334] "Generic (PLEG): container finished" podID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerID="288eefec81b0e990905f2d152c7dee1754da7b88b63cb0100e9830b4ad22e8da" exitCode=0 Jan 27 00:31:27 crc kubenswrapper[4774]: I0127 00:31:27.423769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerDied","Data":"288eefec81b0e990905f2d152c7dee1754da7b88b63cb0100e9830b4ad22e8da"} Jan 27 00:31:27 crc kubenswrapper[4774]: I0127 00:31:27.473599 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d/manage-dockerfile/0.log" Jan 27 00:31:28 crc kubenswrapper[4774]: I0127 00:31:28.434822 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" 
event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerStarted","Data":"8e0c1422d3f12ac19f4457cba5615e0ac867694b08ef9415ad1cd3f3c7dbeda8"} Jan 27 00:31:28 crc kubenswrapper[4774]: I0127 00:31:28.499109 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=4.499076191 podStartE2EDuration="4.499076191s" podCreationTimestamp="2026-01-27 00:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:31:28.47328875 +0000 UTC m=+1466.779065644" watchObservedRunningTime="2026-01-27 00:31:28.499076191 +0000 UTC m=+1466.804853095" Jan 27 00:31:57 crc kubenswrapper[4774]: I0127 00:31:57.652161 4774 generic.go:334] "Generic (PLEG): container finished" podID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerID="8e0c1422d3f12ac19f4457cba5615e0ac867694b08ef9415ad1cd3f3c7dbeda8" exitCode=0 Jan 27 00:31:57 crc kubenswrapper[4774]: I0127 00:31:57.652274 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerDied","Data":"8e0c1422d3f12ac19f4457cba5615e0ac867694b08ef9415ad1cd3f3c7dbeda8"} Jan 27 00:31:58 crc kubenswrapper[4774]: I0127 00:31:58.942041 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058634 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058706 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058736 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058794 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058845 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058902 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058932 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.058991 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059023 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059047 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059082 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059119 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059176 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") pod \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\" (UID: \"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d\") " Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059557 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.059985 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.060731 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.060808 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.061005 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.060998 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.061316 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.067906 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-pull") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "builder-dockercfg-8h4m5-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.069564 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8" (OuterVolumeSpecName: "kube-api-access-f9cf8") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "kube-api-access-f9cf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.069639 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.071116 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push" (OuterVolumeSpecName: "builder-dockercfg-8h4m5-push") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "builder-dockercfg-8h4m5-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161313 4774 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161377 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161401 4774 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161419 4774 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161437 4774 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161454 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-push\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-push\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161472 4774 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 
00:31:59.161490 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9cf8\" (UniqueName: \"kubernetes.io/projected/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-kube-api-access-f9cf8\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161508 4774 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161528 4774 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-8h4m5-pull\" (UniqueName: \"kubernetes.io/secret/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-builder-dockercfg-8h4m5-pull\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.161591 4774 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.265213 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.365171 4774 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.676158 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d","Type":"ContainerDied","Data":"2535846c7c29096f9e9f5c73067f9cff2bd19eb11c98df80dd4d6a08c09a677e"} Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.676206 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2535846c7c29096f9e9f5c73067f9cff2bd19eb11c98df80dd4d6a08c09a677e" Jan 27 00:31:59 crc kubenswrapper[4774]: I0127 00:31:59.676299 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.426528 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" (UID: "d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.485083 4774 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.852935 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:00 crc kubenswrapper[4774]: E0127 00:32:00.853678 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="docker-build" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.853723 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="docker-build" Jan 27 00:32:00 crc kubenswrapper[4774]: E0127 00:32:00.853759 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="manage-dockerfile" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.853779 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="manage-dockerfile" Jan 27 00:32:00 crc kubenswrapper[4774]: E0127 00:32:00.853825 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="git-clone" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.853843 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="git-clone" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.854122 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f3ae4e-c07e-4fe2-b1d5-90614ebfb74d" containerName="docker-build" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.855153 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.856421 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.860399 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-lx54v" Jan 27 00:32:00 crc kubenswrapper[4774]: I0127 00:32:00.992029 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") pod \"infrawatch-operators-vwmhk\" (UID: \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\") " pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.094020 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") pod \"infrawatch-operators-vwmhk\" (UID: \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\") " pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.111056 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") pod \"infrawatch-operators-vwmhk\" (UID: \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\") " pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.174761 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.432443 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:01 crc kubenswrapper[4774]: W0127 00:32:01.438037 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcbe95ca_c88b_4959_ba19_a9f1fb1bbfd8.slice/crio-b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319 WatchSource:0}: Error finding container b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319: Status 404 returned error can't find the container with id b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319 Jan 27 00:32:01 crc kubenswrapper[4774]: I0127 00:32:01.699602 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-vwmhk" event={"ID":"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8","Type":"ContainerStarted","Data":"b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319"} Jan 27 00:32:05 crc kubenswrapper[4774]: I0127 00:32:05.226267 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.033990 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-cn27j"] Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.035159 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.048563 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-cn27j"] Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.070438 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh4mw\" (UniqueName: \"kubernetes.io/projected/59e04641-33ea-4c4e-8ebd-8cf84728cd95-kube-api-access-kh4mw\") pod \"infrawatch-operators-cn27j\" (UID: \"59e04641-33ea-4c4e-8ebd-8cf84728cd95\") " pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.172647 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh4mw\" (UniqueName: \"kubernetes.io/projected/59e04641-33ea-4c4e-8ebd-8cf84728cd95-kube-api-access-kh4mw\") pod \"infrawatch-operators-cn27j\" (UID: \"59e04641-33ea-4c4e-8ebd-8cf84728cd95\") " pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.196125 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh4mw\" (UniqueName: \"kubernetes.io/projected/59e04641-33ea-4c4e-8ebd-8cf84728cd95-kube-api-access-kh4mw\") pod \"infrawatch-operators-cn27j\" (UID: \"59e04641-33ea-4c4e-8ebd-8cf84728cd95\") " pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:06 crc kubenswrapper[4774]: I0127 00:32:06.355552 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.111849 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-cn27j"] Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.788552 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-cn27j" event={"ID":"59e04641-33ea-4c4e-8ebd-8cf84728cd95","Type":"ContainerStarted","Data":"4ed4e3fb8816316d4ac7d9c7d093d5f695a4ef5a205546d613809d8ff377ffdc"} Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.788891 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-cn27j" event={"ID":"59e04641-33ea-4c4e-8ebd-8cf84728cd95","Type":"ContainerStarted","Data":"c6e03c1650e7688d361558deb96a907e1ae5d0c30ce51b720fa51b06dc2237f4"} Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.790651 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-vwmhk" event={"ID":"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8","Type":"ContainerStarted","Data":"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299"} Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.790851 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-vwmhk" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerName="registry-server" containerID="cri-o://5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" gracePeriod=2 Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.814501 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-cn27j" podStartSLOduration=6.717040112 podStartE2EDuration="6.814472723s" podCreationTimestamp="2026-01-27 00:32:06 +0000 UTC" firstStartedPulling="2026-01-27 00:32:12.117024335 +0000 UTC 
m=+1510.422801229" lastFinishedPulling="2026-01-27 00:32:12.214456956 +0000 UTC m=+1510.520233840" observedRunningTime="2026-01-27 00:32:12.812827999 +0000 UTC m=+1511.118604893" watchObservedRunningTime="2026-01-27 00:32:12.814472723 +0000 UTC m=+1511.120249627" Jan 27 00:32:12 crc kubenswrapper[4774]: I0127 00:32:12.832003 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-vwmhk" podStartSLOduration=2.41902672 podStartE2EDuration="12.831979373s" podCreationTimestamp="2026-01-27 00:32:00 +0000 UTC" firstStartedPulling="2026-01-27 00:32:01.440203151 +0000 UTC m=+1499.745980035" lastFinishedPulling="2026-01-27 00:32:11.853155784 +0000 UTC m=+1510.158932688" observedRunningTime="2026-01-27 00:32:12.827340068 +0000 UTC m=+1511.133116952" watchObservedRunningTime="2026-01-27 00:32:12.831979373 +0000 UTC m=+1511.137756267" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.185569 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.327353 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") pod \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\" (UID: \"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8\") " Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.332361 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl" (OuterVolumeSpecName: "kube-api-access-j7kgl") pod "bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" (UID: "bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8"). InnerVolumeSpecName "kube-api-access-j7kgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.429181 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7kgl\" (UniqueName: \"kubernetes.io/projected/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8-kube-api-access-j7kgl\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.803942 4774 generic.go:334] "Generic (PLEG): container finished" podID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerID="5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" exitCode=0 Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.804001 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-vwmhk" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.804018 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-vwmhk" event={"ID":"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8","Type":"ContainerDied","Data":"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299"} Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.804102 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-vwmhk" event={"ID":"bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8","Type":"ContainerDied","Data":"b2b329d8dc00ef22ffe5b0d5450c51e0ecc82d260d99e775a032106f82924319"} Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.804181 4774 scope.go:117] "RemoveContainer" containerID="5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.825780 4774 scope.go:117] "RemoveContainer" containerID="5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" Jan 27 00:32:13 crc kubenswrapper[4774]: E0127 00:32:13.826299 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299\": container with ID starting with 5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299 not found: ID does not exist" containerID="5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.826344 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299"} err="failed to get container status \"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299\": rpc error: code = NotFound desc = could not find container \"5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299\": container with ID starting with 5f779f0d6d68e1f217dacaf454c0a236484e216dd05c3e60ae7b9bebe4ade299 not found: ID does not exist" Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.844175 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:13 crc kubenswrapper[4774]: I0127 00:32:13.849194 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-vwmhk"] Jan 27 00:32:14 crc kubenswrapper[4774]: I0127 00:32:14.365521 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" path="/var/lib/kubelet/pods/bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8/volumes" Jan 27 00:32:16 crc kubenswrapper[4774]: I0127 00:32:16.364537 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:16 crc kubenswrapper[4774]: I0127 00:32:16.365165 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:16 crc kubenswrapper[4774]: I0127 00:32:16.407108 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:26 crc kubenswrapper[4774]: I0127 00:32:26.405930 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-cn27j" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.710213 4774 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7"] Jan 27 00:32:33 crc kubenswrapper[4774]: E0127 00:32:33.711592 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerName="registry-server" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.711615 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerName="registry-server" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.711833 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcbe95ca-c88b-4959-ba19-a9f1fb1bbfd8" containerName="registry-server" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.713302 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.735575 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7"] Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.833512 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.833636 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.833691 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.935616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.935673 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc 
kubenswrapper[4774]: I0127 00:32:33.935785 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.936359 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.936590 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:33 crc kubenswrapper[4774]: I0127 00:32:33.961937 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.055242 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.328200 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7"] Jan 27 00:32:34 crc kubenswrapper[4774]: W0127 00:32:34.333148 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc4a7d0e_83e6_4280_b61b_9741ebc0e7b8.slice/crio-817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97 WatchSource:0}: Error finding container 817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97: Status 404 returned error can't find the container with id 817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97 Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.583963 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6"] Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.585218 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.586434 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6"] Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.776635 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.777209 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.777260 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.878549 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.879102 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.879357 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.879509 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 
crc kubenswrapper[4774]: I0127 00:32:34.879661 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.899947 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.901546 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.988770 4774 generic.go:334] "Generic (PLEG): container finished" podID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerID="f87228d204d05e2c7edfba62efba195ea1177f1d742fa63691cff26a62826ec8" exitCode=0 Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.988820 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerDied","Data":"f87228d204d05e2c7edfba62efba195ea1177f1d742fa63691cff26a62826ec8"} Jan 27 00:32:34 crc kubenswrapper[4774]: I0127 00:32:34.988853 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerStarted","Data":"817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97"} Jan 27 00:32:35 crc kubenswrapper[4774]: I0127 00:32:35.203520 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6"] Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:35.999959 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerID="7c2af4f30102be4e7a9548c15d0ad6af09b9f6809929ebe6feb135743669b8fd" exitCode=0 Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.000095 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerDied","Data":"7c2af4f30102be4e7a9548c15d0ad6af09b9f6809929ebe6feb135743669b8fd"} Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.000140 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerStarted","Data":"c5035b3849c2e389030163c442ab41df56f1602f87c1dae5841a6e2bd5012072"} Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.007636 4774 generic.go:334] "Generic (PLEG): container finished" podID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerID="f0e75a394cc094a5f4cec0ec7069884dbd9edda6ba29db3b1174a591ce8c3a8f" 
exitCode=0 Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.007718 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerDied","Data":"f0e75a394cc094a5f4cec0ec7069884dbd9edda6ba29db3b1174a591ce8c3a8f"} Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.675900 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:32:36 crc kubenswrapper[4774]: I0127 00:32:36.676007 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:32:37 crc kubenswrapper[4774]: I0127 00:32:37.022373 4774 generic.go:334] "Generic (PLEG): container finished" podID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerID="cb3a7f05e8d9c8671aff7b6c09b3c0b0699a1a069036af739ee21f3e6a9ba97b" exitCode=0 Jan 27 00:32:37 crc kubenswrapper[4774]: I0127 00:32:37.022452 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerDied","Data":"cb3a7f05e8d9c8671aff7b6c09b3c0b0699a1a069036af739ee21f3e6a9ba97b"} Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.032012 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerID="a2fe27398823d30032f627a91fc5c0a9fedb46b77683c07313850ad63a7b211c" exitCode=0 Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.032388 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerDied","Data":"a2fe27398823d30032f627a91fc5c0a9fedb46b77683c07313850ad63a7b211c"} Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.331075 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.336010 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") pod \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.336052 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") pod \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.336088 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") pod \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\" (UID: \"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8\") " Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.336815 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle" (OuterVolumeSpecName: "bundle") pod "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" (UID: "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.344116 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45" (OuterVolumeSpecName: "kube-api-access-n7d45") pod "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" (UID: "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8"). InnerVolumeSpecName "kube-api-access-n7d45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.369835 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util" (OuterVolumeSpecName: "util") pod "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" (UID: "bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.441609 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.441751 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:38 crc kubenswrapper[4774]: I0127 00:32:38.441783 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7d45\" (UniqueName: \"kubernetes.io/projected/bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8-kube-api-access-n7d45\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.046095 4774 generic.go:334] "Generic (PLEG): container finished" podID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerID="3f2e4169dbca4055ddabddcbc7c0c8c18ecded89bc78879bc04e58cb8423ee4a" exitCode=0 Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.046265 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerDied","Data":"3f2e4169dbca4055ddabddcbc7c0c8c18ecded89bc78879bc04e58cb8423ee4a"} Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.049396 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" event={"ID":"bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8","Type":"ContainerDied","Data":"817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97"} Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.049826 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="817a982558f2b31bcd3fad39c11f6abb57a2c240e45a7ee4d606bb41094cef97" Jan 27 00:32:39 crc kubenswrapper[4774]: I0127 00:32:39.049773 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65almvv7" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.434178 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.575736 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") pod \"ba2c0178-4760-4b77-990e-35b2bcd11729\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.576010 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") pod \"ba2c0178-4760-4b77-990e-35b2bcd11729\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.576227 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") pod \"ba2c0178-4760-4b77-990e-35b2bcd11729\" (UID: \"ba2c0178-4760-4b77-990e-35b2bcd11729\") " Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.577291 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle" (OuterVolumeSpecName: "bundle") pod "ba2c0178-4760-4b77-990e-35b2bcd11729" (UID: "ba2c0178-4760-4b77-990e-35b2bcd11729"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.584838 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m" (OuterVolumeSpecName: "kube-api-access-rkw2m") pod "ba2c0178-4760-4b77-990e-35b2bcd11729" (UID: "ba2c0178-4760-4b77-990e-35b2bcd11729"). InnerVolumeSpecName "kube-api-access-rkw2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.614606 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util" (OuterVolumeSpecName: "util") pod "ba2c0178-4760-4b77-990e-35b2bcd11729" (UID: "ba2c0178-4760-4b77-990e-35b2bcd11729"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.677968 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.678025 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba2c0178-4760-4b77-990e-35b2bcd11729-util\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:40 crc kubenswrapper[4774]: I0127 00:32:40.678046 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkw2m\" (UniqueName: \"kubernetes.io/projected/ba2c0178-4760-4b77-990e-35b2bcd11729-kube-api-access-rkw2m\") on node \"crc\" DevicePath \"\"" Jan 27 00:32:41 crc kubenswrapper[4774]: I0127 00:32:41.076712 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" event={"ID":"ba2c0178-4760-4b77-990e-35b2bcd11729","Type":"ContainerDied","Data":"c5035b3849c2e389030163c442ab41df56f1602f87c1dae5841a6e2bd5012072"} Jan 27 00:32:41 crc kubenswrapper[4774]: I0127 00:32:41.076808 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c094t6s6" Jan 27 00:32:41 crc kubenswrapper[4774]: I0127 00:32:41.076814 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5035b3849c2e389030163c442ab41df56f1602f87c1dae5841a6e2bd5012072" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.714797 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f"] Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715815 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="pull" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715829 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="pull" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715837 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="util" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715843 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="util" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715883 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715890 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715899 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="pull" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715905 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="pull" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715919 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="util" Jan 27 00:32:44 crc 
kubenswrapper[4774]: I0127 00:32:44.715925 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="util" Jan 27 00:32:44 crc kubenswrapper[4774]: E0127 00:32:44.715942 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.715947 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.716044 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4a7d0e-83e6-4280-b61b-9741ebc0e7b8" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.716057 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2c0178-4760-4b77-990e-35b2bcd11729" containerName="extract" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.716491 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.718736 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-7q8nh" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.737291 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f"] Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.845001 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/61fa30c6-90bc-4b6c-b850-2b2e59506e08-runner\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.845104 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj27v\" (UniqueName: \"kubernetes.io/projected/61fa30c6-90bc-4b6c-b850-2b2e59506e08-kube-api-access-fj27v\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.946568 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/61fa30c6-90bc-4b6c-b850-2b2e59506e08-runner\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.946641 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj27v\" (UniqueName: \"kubernetes.io/projected/61fa30c6-90bc-4b6c-b850-2b2e59506e08-kube-api-access-fj27v\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.947136 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/61fa30c6-90bc-4b6c-b850-2b2e59506e08-runner\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " 
pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:44 crc kubenswrapper[4774]: I0127 00:32:44.970555 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj27v\" (UniqueName: \"kubernetes.io/projected/61fa30c6-90bc-4b6c-b850-2b2e59506e08-kube-api-access-fj27v\") pod \"smart-gateway-operator-7b4c7b595f-78n7f\" (UID: \"61fa30c6-90bc-4b6c-b850-2b2e59506e08\") " pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:45 crc kubenswrapper[4774]: I0127 00:32:45.036738 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" Jan 27 00:32:45 crc kubenswrapper[4774]: I0127 00:32:45.339693 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f"] Jan 27 00:32:46 crc kubenswrapper[4774]: I0127 00:32:46.125443 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" event={"ID":"61fa30c6-90bc-4b6c-b850-2b2e59506e08","Type":"ContainerStarted","Data":"52e64aa0f0917d58a00bf3f233cb9c02240bdaf763b5e485dbb82ec246634fc2"} Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.577208 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-59f5866557-f7gmp"] Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.578902 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.584746 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-fp78c" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.604403 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-59f5866557-f7gmp"] Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.708025 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/14f2c5e3-3625-4889-97e4-38820ac84518-runner\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.708164 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbdvx\" (UniqueName: \"kubernetes.io/projected/14f2c5e3-3625-4889-97e4-38820ac84518-kube-api-access-pbdvx\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.809191 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/14f2c5e3-3625-4889-97e4-38820ac84518-runner\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.809284 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbdvx\" (UniqueName: \"kubernetes.io/projected/14f2c5e3-3625-4889-97e4-38820ac84518-kube-api-access-pbdvx\") 
pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.809835 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/14f2c5e3-3625-4889-97e4-38820ac84518-runner\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.838553 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbdvx\" (UniqueName: \"kubernetes.io/projected/14f2c5e3-3625-4889-97e4-38820ac84518-kube-api-access-pbdvx\") pod \"service-telemetry-operator-59f5866557-f7gmp\" (UID: \"14f2c5e3-3625-4889-97e4-38820ac84518\") " pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:47 crc kubenswrapper[4774]: I0127 00:32:47.909629 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" Jan 27 00:32:48 crc kubenswrapper[4774]: I0127 00:32:48.176602 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-59f5866557-f7gmp"] Jan 27 00:32:49 crc kubenswrapper[4774]: I0127 00:32:49.151348 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" event={"ID":"14f2c5e3-3625-4889-97e4-38820ac84518","Type":"ContainerStarted","Data":"680b7b1e6ae9ae46a899b28a8678425ab02fa8641a17baf4f97ccb6c15d7030d"} Jan 27 00:33:01 crc kubenswrapper[4774]: E0127 00:33:01.378562 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5" Jan 27 00:33:01 crc kubenswrapper[4774]: E0127 00:33:01.379545 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1769473811,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fj27v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-7b4c7b595f-78n7f_service-telemetry(61fa30c6-90bc-4b6c-b850-2b2e59506e08): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 00:33:01 crc kubenswrapper[4774]: E0127 00:33:01.380768 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" podUID="61fa30c6-90bc-4b6c-b850-2b2e59506e08" Jan 27 00:33:02 crc kubenswrapper[4774]: E0127 00:33:02.267267 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" podUID="61fa30c6-90bc-4b6c-b850-2b2e59506e08" Jan 27 00:33:06 crc kubenswrapper[4774]: I0127 00:33:06.675296 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:33:06 crc kubenswrapper[4774]: I0127 00:33:06.676101 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:33:07 crc kubenswrapper[4774]: I0127 00:33:07.304805 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" event={"ID":"14f2c5e3-3625-4889-97e4-38820ac84518","Type":"ContainerStarted","Data":"d7afeb04dd0ffd0243691462d6091a886d24389b3989eb022072fb60c4475839"} Jan 27 00:33:07 crc kubenswrapper[4774]: I0127 00:33:07.343177 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-59f5866557-f7gmp" podStartSLOduration=2.000696099 podStartE2EDuration="20.343147009s" podCreationTimestamp="2026-01-27 00:32:47 +0000 UTC" firstStartedPulling="2026-01-27 00:32:48.199343626 +0000 UTC m=+1546.505120510" lastFinishedPulling="2026-01-27 00:33:06.541794536 +0000 UTC m=+1564.847571420" observedRunningTime="2026-01-27 00:33:07.339124091 +0000 UTC m=+1565.644901035" watchObservedRunningTime="2026-01-27 00:33:07.343147009 +0000 UTC m=+1565.648923923" Jan 27 00:33:17 crc kubenswrapper[4774]: I0127 00:33:17.408419 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" event={"ID":"61fa30c6-90bc-4b6c-b850-2b2e59506e08","Type":"ContainerStarted","Data":"edb20735273418fbc3f51d9924e60b74128a6d585bbbb04ccd7623fc08a5d45d"} Jan 27 00:33:17 crc kubenswrapper[4774]: I0127 00:33:17.433076 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-7b4c7b595f-78n7f" podStartSLOduration=1.940795568 podStartE2EDuration="33.433042655s" podCreationTimestamp="2026-01-27 00:32:44 +0000 UTC" firstStartedPulling="2026-01-27 00:32:45.349215055 +0000 UTC m=+1543.654991939" lastFinishedPulling="2026-01-27 00:33:16.841462142 +0000 UTC m=+1575.147239026" observedRunningTime="2026-01-27 00:33:17.430420204 +0000 UTC m=+1575.736197098" watchObservedRunningTime="2026-01-27 00:33:17.433042655 +0000 UTC m=+1575.738819549" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.066763 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.077006 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.083944 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.084400 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.084613 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.084774 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.085036 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.085609 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-cdfh6" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.085764 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.090190 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.202617 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203075 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203118 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203474 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203655 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203731 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.203793 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.305776 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.305869 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.305918 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.305961 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.306000 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc 
kubenswrapper[4774]: I0127 00:33:34.306065 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.306118 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.307463 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.321932 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.325751 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.326650 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.333411 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.333796 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.350766 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") pod \"default-interconnect-68864d46cb-b5bgt\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.411172 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:33:34 crc kubenswrapper[4774]: I0127 00:33:34.704683 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:33:35 crc kubenswrapper[4774]: I0127 00:33:35.597584 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" event={"ID":"2c8f3edd-38d2-4a50-add2-873dd1ac35e5","Type":"ContainerStarted","Data":"a5ba257ff6bd2d15abb20ec26902d31e3cb2085f1429dfba58273e45e14cd88c"} Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.675378 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.675954 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.676027 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.677271 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:33:36 crc kubenswrapper[4774]: I0127 00:33:36.677382 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" gracePeriod=600 Jan 27 00:33:36 crc kubenswrapper[4774]: E0127 00:33:36.820033 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:33:37 crc kubenswrapper[4774]: I0127 00:33:37.616574 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" exitCode=0 Jan 
27 00:33:37 crc kubenswrapper[4774]: I0127 00:33:37.616628 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541"} Jan 27 00:33:37 crc kubenswrapper[4774]: I0127 00:33:37.616681 4774 scope.go:117] "RemoveContainer" containerID="1c53cb6c38911d466299f8dab8954ff32a3cd2ab17025a91e5e6eb03240440a5" Jan 27 00:33:37 crc kubenswrapper[4774]: I0127 00:33:37.617679 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:33:37 crc kubenswrapper[4774]: E0127 00:33:37.619349 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:33:40 crc kubenswrapper[4774]: I0127 00:33:40.644927 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" event={"ID":"2c8f3edd-38d2-4a50-add2-873dd1ac35e5","Type":"ContainerStarted","Data":"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f"} Jan 27 00:33:40 crc kubenswrapper[4774]: I0127 00:33:40.671286 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" podStartSLOduration=1.019683132 podStartE2EDuration="6.67126534s" podCreationTimestamp="2026-01-27 00:33:34 +0000 UTC" firstStartedPulling="2026-01-27 00:33:34.716476797 +0000 UTC m=+1593.022253681" lastFinishedPulling="2026-01-27 00:33:40.368059005 +0000 UTC m=+1598.673835889" observedRunningTime="2026-01-27 00:33:40.670315564 +0000 UTC m=+1598.976092488" watchObservedRunningTime="2026-01-27 00:33:40.67126534 +0000 UTC m=+1598.977042224" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.829200 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.832074 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.835186 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.835670 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.835798 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836059 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836403 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836440 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836755 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.836837 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.839130 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.840818 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-lqwbh" Jan 27 00:33:45 crc kubenswrapper[4774]: I0127 00:33:45.852475 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018474 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018540 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018565 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018593 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-tls-assets\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.018613 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhkc\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-kube-api-access-hnhkc\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019014 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71259922-2e93-4571-80c6-e054f4372056-config-out\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019445 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019541 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-web-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019595 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.019675 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.020142 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.020237 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc 
kubenswrapper[4774]: I0127 00:33:46.122535 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122620 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-web-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122669 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122716 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122797 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: E0127 00:33:46.122827 4774 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Jan 27 00:33:46 crc kubenswrapper[4774]: E0127 00:33:46.122997 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls podName:71259922-2e93-4571-80c6-e054f4372056 nodeName:}" failed. No retries permitted until 2026-01-27 00:33:46.622956242 +0000 UTC m=+1604.928733306 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "71259922-2e93-4571-80c6-e054f4372056") : secret "default-prometheus-proxy-tls" not found Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.122855 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123559 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123642 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123821 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-tls-assets\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.123887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhkc\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-kube-api-access-hnhkc\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.124030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71259922-2e93-4571-80c6-e054f4372056-config-out\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.124904 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: 
I0127 00:33:46.125070 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.125473 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.126885 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71259922-2e93-4571-80c6-e054f4372056-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.134578 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.136448 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.136800 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-tls-assets\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.143069 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-web-config\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.143243 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
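
The entries just above and below show the kubelet's volume reconciler hitting a Secret that does not exist yet: MountVolume.SetUp for secret-default-prometheus-proxy-tls fails with secret "default-prometheus-proxy-tls" not found, and nestedpendingoperations schedules the next attempt with a doubling durationBeforeRetry (500ms here, 1s on the next failure; the alertmanager-default-0 entries further down reach 2s) until the Secret is created and the mount succeeds. A minimal Go sketch of that retry shape follows; the getSecret stub and the cap value are assumptions for illustration, not kubelet internals.

// Minimal sketch of the retry pattern visible in these entries: an operation
// that fails because a Secret does not exist yet is retried with a doubling
// delay (500ms, 1s, 2s, ...) until it succeeds. Illustrative only -- the
// getSecret stub and the 2m cap are assumptions, not kubelet code.
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "default-prometheus-proxy-tls" not found`)

// getSecret stands in for the API lookup; here it succeeds on the third try.
func getSecret(attempt int) error {
	if attempt < 3 {
		return errNotFound
	}
	return nil
}

func main() {
	delay := 500 * time.Millisecond // initial durationBeforeRetry seen in the log
	const maxDelay = 2 * time.Minute

	for attempt := 1; ; attempt++ {
		if err := getSecret(attempt); err != nil {
			fmt.Printf("attempt %d: %v; no retries permitted for %v\n", attempt, err, delay)
			time.Sleep(delay)
			delay *= 2
			if delay > maxDelay {
				delay = maxDelay
			}
			continue
		}
		fmt.Printf("attempt %d: MountVolume.SetUp succeeded\n", attempt)
		return
	}
}

In the log, the prometheus-default-0 mount succeeds on the 1s retry at 00:33:47, once the Secret exists.
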
Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.143302 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/295e7c323f08c6637b248777cc2f8a161194fb17a8c00659f66f94d7e9196f62/globalmount\"" pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.156628 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/71259922-2e93-4571-80c6-e054f4372056-config-out\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.160446 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhkc\" (UniqueName: \"kubernetes.io/projected/71259922-2e93-4571-80c6-e054f4372056-kube-api-access-hnhkc\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.186430 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eebaa146-f219-4ffc-bee5-24f5200b6035\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: I0127 00:33:46.632225 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:46 crc kubenswrapper[4774]: E0127 00:33:46.632547 4774 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Jan 27 00:33:46 crc kubenswrapper[4774]: E0127 00:33:46.632694 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls podName:71259922-2e93-4571-80c6-e054f4372056 nodeName:}" failed. No retries permitted until 2026-01-27 00:33:47.632648509 +0000 UTC m=+1605.938425433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "71259922-2e93-4571-80c6-e054f4372056") : secret "default-prometheus-proxy-tls" not found Jan 27 00:33:47 crc kubenswrapper[4774]: I0127 00:33:47.647946 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:47 crc kubenswrapper[4774]: I0127 00:33:47.664633 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71259922-2e93-4571-80c6-e054f4372056-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"71259922-2e93-4571-80c6-e054f4372056\") " pod="service-telemetry/prometheus-default-0" Jan 27 00:33:47 crc kubenswrapper[4774]: I0127 00:33:47.953971 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Jan 27 00:33:48 crc kubenswrapper[4774]: I0127 00:33:48.285093 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Jan 27 00:33:48 crc kubenswrapper[4774]: I0127 00:33:48.714268 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"d6b041c8b3b87f133eeccbd1e07edb577775842dd494617252f6f3a9d37ad2a2"} Jan 27 00:33:50 crc kubenswrapper[4774]: I0127 00:33:50.357114 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:33:50 crc kubenswrapper[4774]: E0127 00:33:50.357416 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:33:52 crc kubenswrapper[4774]: I0127 00:33:52.748804 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"a1008d19697279cbd14e1da441adaa8c511db421d870ec17dd82ad24769c432c"} Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.005674 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-mzmt2"] Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.006912 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.021140 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-mzmt2"] Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.119691 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h78v\" (UniqueName: \"kubernetes.io/projected/60dce4ee-cdb6-4418-ac73-063f48c8be7e-kube-api-access-2h78v\") pod \"default-snmp-webhook-6856cfb745-mzmt2\" (UID: \"60dce4ee-cdb6-4418-ac73-063f48c8be7e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.221684 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h78v\" (UniqueName: \"kubernetes.io/projected/60dce4ee-cdb6-4418-ac73-063f48c8be7e-kube-api-access-2h78v\") pod \"default-snmp-webhook-6856cfb745-mzmt2\" (UID: \"60dce4ee-cdb6-4418-ac73-063f48c8be7e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.242669 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h78v\" (UniqueName: \"kubernetes.io/projected/60dce4ee-cdb6-4418-ac73-063f48c8be7e-kube-api-access-2h78v\") pod \"default-snmp-webhook-6856cfb745-mzmt2\" (UID: \"60dce4ee-cdb6-4418-ac73-063f48c8be7e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.325888 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.754311 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-mzmt2"] Jan 27 00:33:56 crc kubenswrapper[4774]: I0127 00:33:56.783814 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" event={"ID":"60dce4ee-cdb6-4418-ac73-063f48c8be7e","Type":"ContainerStarted","Data":"cc4113bd613c5a9940e3f7218e5d5ed472db1ff986ad6ac7d7cdf735ebd0b61e"} Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.809559 4774 generic.go:334] "Generic (PLEG): container finished" podID="71259922-2e93-4571-80c6-e054f4372056" containerID="a1008d19697279cbd14e1da441adaa8c511db421d870ec17dd82ad24769c432c" exitCode=0 Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.809681 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerDied","Data":"a1008d19697279cbd14e1da441adaa8c511db421d870ec17dd82ad24769c432c"} Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.992603 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.994700 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Jan 27 00:33:59 crc kubenswrapper[4774]: I0127 00:33:59.997182 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001177 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001489 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-c9qmn" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001616 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001841 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.001895 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.016005 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.190475 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.190801 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191012 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191140 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-tls-assets\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191284 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-volume\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191387 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.191999 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-out\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.192144 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-web-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.192270 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbq4\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-kube-api-access-dkbq4\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.293878 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294154 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294250 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294326 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-tls-assets\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: E0127 00:34:00.294353 4774 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294411 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-volume\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: E0127 00:34:00.294512 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls podName:695367d2-8c58-4bee-adcc-61c6d1b7457b nodeName:}" failed. No retries permitted until 2026-01-27 00:34:00.794484428 +0000 UTC m=+1619.100261312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "695367d2-8c58-4bee-adcc-61c6d1b7457b") : secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294545 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294575 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-out\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294618 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-web-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.294659 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbq4\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-kube-api-access-dkbq4\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.299590 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.299625 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1b199c4d1d90e45fd220b5f013926ce2c98f8abb2ae028e59739279000671a16/globalmount\"" pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.299989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-tls-assets\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.300701 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.303223 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.304036 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-web-config\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.304140 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-out\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.315798 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbq4\" (UniqueName: \"kubernetes.io/projected/695367d2-8c58-4bee-adcc-61c6d1b7457b-kube-api-access-dkbq4\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.319451 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-config-volume\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.335467 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-703cac3f-95be-42e6-9c53-fa72eaf5caa2\") pod 
\"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: I0127 00:34:00.801306 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:00 crc kubenswrapper[4774]: E0127 00:34:00.801472 4774 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:00 crc kubenswrapper[4774]: E0127 00:34:00.801525 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls podName:695367d2-8c58-4bee-adcc-61c6d1b7457b nodeName:}" failed. No retries permitted until 2026-01-27 00:34:01.801509227 +0000 UTC m=+1620.107286112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "695367d2-8c58-4bee-adcc-61c6d1b7457b") : secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:01 crc kubenswrapper[4774]: I0127 00:34:01.815666 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:01 crc kubenswrapper[4774]: E0127 00:34:01.815882 4774 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:01 crc kubenswrapper[4774]: E0127 00:34:01.816159 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls podName:695367d2-8c58-4bee-adcc-61c6d1b7457b nodeName:}" failed. No retries permitted until 2026-01-27 00:34:03.816138722 +0000 UTC m=+1622.121915606 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "695367d2-8c58-4bee-adcc-61c6d1b7457b") : secret "default-alertmanager-proxy-tls" not found Jan 27 00:34:03 crc kubenswrapper[4774]: I0127 00:34:03.847761 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:03 crc kubenswrapper[4774]: I0127 00:34:03.855649 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/695367d2-8c58-4bee-adcc-61c6d1b7457b-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"695367d2-8c58-4bee-adcc-61c6d1b7457b\") " pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:03 crc kubenswrapper[4774]: I0127 00:34:03.923919 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-c9qmn" Jan 27 00:34:03 crc kubenswrapper[4774]: I0127 00:34:03.932460 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.356432 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:04 crc kubenswrapper[4774]: E0127 00:34:04.360678 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.488459 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.848559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"5f3f65fbf4bd90f241d356ae013fbe904b385f8470eba91cf73287baff00adde"} Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.853652 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" event={"ID":"60dce4ee-cdb6-4418-ac73-063f48c8be7e","Type":"ContainerStarted","Data":"f67e26410ab3b6fdd6b6d63799b1254d45dd36e56de03df11d5c10f3ec9135fb"} Jan 27 00:34:04 crc kubenswrapper[4774]: I0127 00:34:04.879846 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-mzmt2" podStartSLOduration=2.417712053 podStartE2EDuration="9.879818456s" podCreationTimestamp="2026-01-27 00:33:55 +0000 UTC" firstStartedPulling="2026-01-27 00:33:56.765200369 +0000 UTC m=+1615.070977253" lastFinishedPulling="2026-01-27 00:34:04.227306772 +0000 UTC m=+1622.533083656" observedRunningTime="2026-01-27 
00:34:04.874044491 +0000 UTC m=+1623.179821375" watchObservedRunningTime="2026-01-27 00:34:04.879818456 +0000 UTC m=+1623.185595350" Jan 27 00:34:06 crc kubenswrapper[4774]: I0127 00:34:06.870188 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"810749b3c3ce02665b129ee8d84b5fefda2c703314683991363a3f30a39bc9ca"} Jan 27 00:34:08 crc kubenswrapper[4774]: I0127 00:34:08.891226 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"061e1a8c11e7384319c73f1ff4dd7449660819bd8e6d3e217e71ab859164fe4b"} Jan 27 00:34:11 crc kubenswrapper[4774]: I0127 00:34:11.921399 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"9206af83fa1defd5137276c13b171505de9e9b0669078414806578e2b80ea754"} Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.371118 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr"] Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.372933 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.411044 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-fv29r" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.411201 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.418303 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.418798 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr"] Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.419589 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512033 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/221c6d11-7462-4578-bbb1-a78ee6bad7c0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512141 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512176 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512360 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrwsx\" (UniqueName: \"kubernetes.io/projected/221c6d11-7462-4578-bbb1-a78ee6bad7c0-kube-api-access-mrwsx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.512393 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/221c6d11-7462-4578-bbb1-a78ee6bad7c0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614269 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/221c6d11-7462-4578-bbb1-a78ee6bad7c0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614388 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614450 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614539 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrwsx\" (UniqueName: \"kubernetes.io/projected/221c6d11-7462-4578-bbb1-a78ee6bad7c0-kube-api-access-mrwsx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.614585 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/221c6d11-7462-4578-bbb1-a78ee6bad7c0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: E0127 00:34:13.615146 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.615160 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/221c6d11-7462-4578-bbb1-a78ee6bad7c0-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: E0127 00:34:13.615260 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls podName:221c6d11-7462-4578-bbb1-a78ee6bad7c0 nodeName:}" failed. No retries permitted until 2026-01-27 00:34:14.11523156 +0000 UTC m=+1632.421008444 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" (UID: "221c6d11-7462-4578-bbb1-a78ee6bad7c0") : secret "default-cloud1-coll-meter-proxy-tls" not found Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.616366 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/221c6d11-7462-4578-bbb1-a78ee6bad7c0-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.621559 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:13 crc kubenswrapper[4774]: I0127 00:34:13.632883 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrwsx\" (UniqueName: \"kubernetes.io/projected/221c6d11-7462-4578-bbb1-a78ee6bad7c0-kube-api-access-mrwsx\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:14 crc kubenswrapper[4774]: I0127 00:34:14.122823 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:14 crc kubenswrapper[4774]: E0127 00:34:14.123181 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Jan 27 00:34:14 crc 
kubenswrapper[4774]: E0127 00:34:14.123398 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls podName:221c6d11-7462-4578-bbb1-a78ee6bad7c0 nodeName:}" failed. No retries permitted until 2026-01-27 00:34:15.123350638 +0000 UTC m=+1633.429127572 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" (UID: "221c6d11-7462-4578-bbb1-a78ee6bad7c0") : secret "default-cloud1-coll-meter-proxy-tls" not found Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.159593 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.166071 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/221c6d11-7462-4578-bbb1-a78ee6bad7c0-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr\" (UID: \"221c6d11-7462-4578-bbb1-a78ee6bad7c0\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.226771 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.477381 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr"] Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.850415 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9"] Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.852381 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.854558 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.868247 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9"] Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.854955 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977190 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977246 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2rpf\" (UniqueName: \"kubernetes.io/projected/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-kube-api-access-l2rpf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977282 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977315 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.977337 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.979748 4774 generic.go:334] "Generic (PLEG): container finished" podID="695367d2-8c58-4bee-adcc-61c6d1b7457b" containerID="810749b3c3ce02665b129ee8d84b5fefda2c703314683991363a3f30a39bc9ca" exitCode=0 Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.979885 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" 
event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerDied","Data":"810749b3c3ce02665b129ee8d84b5fefda2c703314683991363a3f30a39bc9ca"} Jan 27 00:34:15 crc kubenswrapper[4774]: I0127 00:34:15.983352 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"cc54d308655542a0ddec060a01b614ea149c897981047fdde83598481e298a82"} Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079020 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079395 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079421 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: E0127 00:34:16.079240 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079504 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: E0127 00:34:16.079553 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls podName:d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6 nodeName:}" failed. No retries permitted until 2026-01-27 00:34:16.579522845 +0000 UTC m=+1634.885299729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" (UID: "d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6") : secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.079589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2rpf\" (UniqueName: \"kubernetes.io/projected/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-kube-api-access-l2rpf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.080177 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.080267 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.088033 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.108341 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2rpf\" (UniqueName: \"kubernetes.io/projected/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-kube-api-access-l2rpf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: I0127 00:34:16.588110 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:16 crc kubenswrapper[4774]: E0127 00:34:16.588463 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 27 00:34:16 crc kubenswrapper[4774]: E0127 00:34:16.588623 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls 
podName:d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6 nodeName:}" failed. No retries permitted until 2026-01-27 00:34:17.588596429 +0000 UTC m=+1635.894373313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" (UID: "d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6") : secret "default-cloud1-ceil-meter-proxy-tls" not found Jan 27 00:34:17 crc kubenswrapper[4774]: I0127 00:34:17.605111 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:17 crc kubenswrapper[4774]: I0127 00:34:17.618411 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9\" (UID: \"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:17 crc kubenswrapper[4774]: I0127 00:34:17.739907 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.357391 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:19 crc kubenswrapper[4774]: E0127 00:34:19.357937 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.773880 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g"] Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.775989 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.778423 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.779231 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.787826 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g"] Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954000 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/cce880cf-c820-49a2-9cf3-c2161499d51f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954054 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvrs\" (UniqueName: \"kubernetes.io/projected/cce880cf-c820-49a2-9cf3-c2161499d51f-kube-api-access-wkvrs\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954115 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954242 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:19 crc kubenswrapper[4774]: I0127 00:34:19.954427 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/cce880cf-c820-49a2-9cf3-c2161499d51f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.055592 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/cce880cf-c820-49a2-9cf3-c2161499d51f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc 
kubenswrapper[4774]: I0127 00:34:20.055679 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/cce880cf-c820-49a2-9cf3-c2161499d51f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.055699 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvrs\" (UniqueName: \"kubernetes.io/projected/cce880cf-c820-49a2-9cf3-c2161499d51f-kube-api-access-wkvrs\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.055751 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.055779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: E0127 00:34:20.056182 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.056276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/cce880cf-c820-49a2-9cf3-c2161499d51f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: E0127 00:34:20.056313 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls podName:cce880cf-c820-49a2-9cf3-c2161499d51f nodeName:}" failed. No retries permitted until 2026-01-27 00:34:20.556267677 +0000 UTC m=+1638.862044561 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" (UID: "cce880cf-c820-49a2-9cf3-c2161499d51f") : secret "default-cloud1-sens-meter-proxy-tls" not found Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.056462 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/cce880cf-c820-49a2-9cf3-c2161499d51f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.064605 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.074335 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvrs\" (UniqueName: \"kubernetes.io/projected/cce880cf-c820-49a2-9cf3-c2161499d51f-kube-api-access-wkvrs\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: I0127 00:34:20.562949 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:20 crc kubenswrapper[4774]: E0127 00:34:20.563172 4774 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Jan 27 00:34:20 crc kubenswrapper[4774]: E0127 00:34:20.563262 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls podName:cce880cf-c820-49a2-9cf3-c2161499d51f nodeName:}" failed. No retries permitted until 2026-01-27 00:34:21.563243385 +0000 UTC m=+1639.869020269 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" (UID: "cce880cf-c820-49a2-9cf3-c2161499d51f") : secret "default-cloud1-sens-meter-proxy-tls" not found Jan 27 00:34:21 crc kubenswrapper[4774]: I0127 00:34:21.623164 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:21 crc kubenswrapper[4774]: I0127 00:34:21.629720 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/cce880cf-c820-49a2-9cf3-c2161499d51f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g\" (UID: \"cce880cf-c820-49a2-9cf3-c2161499d51f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:21 crc kubenswrapper[4774]: I0127 00:34:21.900166 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" Jan 27 00:34:23 crc kubenswrapper[4774]: I0127 00:34:23.852097 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g"] Jan 27 00:34:23 crc kubenswrapper[4774]: I0127 00:34:23.858584 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9"] Jan 27 00:34:23 crc kubenswrapper[4774]: W0127 00:34:23.951814 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd82a9cf9_0ba7_416d_93b4_b3bd9bc6a9f6.slice/crio-0dd122556523b6cbccabccbebb66c5d2524883b8236dc80d8eef8d82384996a1 WatchSource:0}: Error finding container 0dd122556523b6cbccabccbebb66c5d2524883b8236dc80d8eef8d82384996a1: Status 404 returned error can't find the container with id 0dd122556523b6cbccabccbebb66c5d2524883b8236dc80d8eef8d82384996a1 Jan 27 00:34:23 crc kubenswrapper[4774]: W0127 00:34:23.954102 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce880cf_c820_49a2_9cf3_c2161499d51f.slice/crio-335c6ca030d87267671be761741da47811afd04cca8b97b22ed6048bc2527041 WatchSource:0}: Error finding container 335c6ca030d87267671be761741da47811afd04cca8b97b22ed6048bc2527041: Status 404 returned error can't find the container with id 335c6ca030d87267671be761741da47811afd04cca8b97b22ed6048bc2527041 Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.053046 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"0dd122556523b6cbccabccbebb66c5d2524883b8236dc80d8eef8d82384996a1"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.058902 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" 
event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"68ad807559137d36959c1aa7090218a544b931c8a28f63f731d68957948ac5c5"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.060247 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"335c6ca030d87267671be761741da47811afd04cca8b97b22ed6048bc2527041"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.062449 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"2acf1a5b21522993031d7a31e0b33469eb0fa0336dd2cd0bf363d142d45721ee"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.065182 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"71259922-2e93-4571-80c6-e054f4372056","Type":"ContainerStarted","Data":"fa6b17978110c0c566da456324f7019ceb843ea7b8ee0b84d9507e78d46fce1b"} Jan 27 00:34:24 crc kubenswrapper[4774]: I0127 00:34:24.093628 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.013662005 podStartE2EDuration="40.093599956s" podCreationTimestamp="2026-01-27 00:33:44 +0000 UTC" firstStartedPulling="2026-01-27 00:33:48.285173991 +0000 UTC m=+1606.590950875" lastFinishedPulling="2026-01-27 00:34:23.365111942 +0000 UTC m=+1641.670888826" observedRunningTime="2026-01-27 00:34:24.089139716 +0000 UTC m=+1642.394916620" watchObservedRunningTime="2026-01-27 00:34:24.093599956 +0000 UTC m=+1642.399376840" Jan 27 00:34:25 crc kubenswrapper[4774]: I0127 00:34:25.101605 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"e86719840d53227a371646ced201bfdda0d466ae22ffba039461546e413783bc"} Jan 27 00:34:25 crc kubenswrapper[4774]: I0127 00:34:25.106006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f"} Jan 27 00:34:26 crc kubenswrapper[4774]: I0127 00:34:26.118736 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"1d5cacd63e8a2f8cf24e618cade0bc5f4b296dad05817cd41c06295a2aba6d65"} Jan 27 00:34:26 crc kubenswrapper[4774]: I0127 00:34:26.122260 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15"} Jan 27 00:34:26 crc kubenswrapper[4774]: I0127 00:34:26.125637 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb"} Jan 27 00:34:26 crc kubenswrapper[4774]: I0127 
00:34:26.125699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"79fc62c1835dfc91d81c5d2ece1e2569c8514a410294d2bda05be96d3ba44d6f"} Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.652302 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8"] Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.653758 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.656973 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.677056 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.678301 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8"] Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.742151 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7jpz\" (UniqueName: \"kubernetes.io/projected/40a1ab63-3b75-4f20-a692-58d80ccd2847-kube-api-access-h7jpz\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.742228 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/40a1ab63-3b75-4f20-a692-58d80ccd2847-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.742262 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/40a1ab63-3b75-4f20-a692-58d80ccd2847-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.742374 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/40a1ab63-3b75-4f20-a692-58d80ccd2847-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.844263 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/40a1ab63-3b75-4f20-a692-58d80ccd2847-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.844330 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/40a1ab63-3b75-4f20-a692-58d80ccd2847-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.844422 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/40a1ab63-3b75-4f20-a692-58d80ccd2847-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.844483 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7jpz\" (UniqueName: \"kubernetes.io/projected/40a1ab63-3b75-4f20-a692-58d80ccd2847-kube-api-access-h7jpz\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.845429 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/40a1ab63-3b75-4f20-a692-58d80ccd2847-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.845760 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/40a1ab63-3b75-4f20-a692-58d80ccd2847-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.853507 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/40a1ab63-3b75-4f20-a692-58d80ccd2847-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.867888 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7jpz\" (UniqueName: \"kubernetes.io/projected/40a1ab63-3b75-4f20-a692-58d80ccd2847-kube-api-access-h7jpz\") pod \"default-cloud1-coll-event-smartgateway-564958f777-cn8q8\" (UID: \"40a1ab63-3b75-4f20-a692-58d80ccd2847\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.955660 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Jan 27 00:34:27 crc kubenswrapper[4774]: I0127 00:34:27.978085 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.585624 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp"] Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.586910 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.590327 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.609028 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp"] Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.655740 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvd77\" (UniqueName: \"kubernetes.io/projected/78320a5f-ee7d-4190-8af3-9d9609bcf111-kube-api-access-rvd77\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.655878 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/78320a5f-ee7d-4190-8af3-9d9609bcf111-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.655917 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/78320a5f-ee7d-4190-8af3-9d9609bcf111-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.656015 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/78320a5f-ee7d-4190-8af3-9d9609bcf111-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.758751 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvd77\" (UniqueName: \"kubernetes.io/projected/78320a5f-ee7d-4190-8af3-9d9609bcf111-kube-api-access-rvd77\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.758895 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/78320a5f-ee7d-4190-8af3-9d9609bcf111-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: 
\"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.758944 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/78320a5f-ee7d-4190-8af3-9d9609bcf111-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.759008 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/78320a5f-ee7d-4190-8af3-9d9609bcf111-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.760157 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/78320a5f-ee7d-4190-8af3-9d9609bcf111-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.760539 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/78320a5f-ee7d-4190-8af3-9d9609bcf111-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.782105 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/78320a5f-ee7d-4190-8af3-9d9609bcf111-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.786894 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvd77\" (UniqueName: \"kubernetes.io/projected/78320a5f-ee7d-4190-8af3-9d9609bcf111-kube-api-access-rvd77\") pod \"default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp\" (UID: \"78320a5f-ee7d-4190-8af3-9d9609bcf111\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:28 crc kubenswrapper[4774]: I0127 00:34:28.909212 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" Jan 27 00:34:29 crc kubenswrapper[4774]: I0127 00:34:29.733927 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8"] Jan 27 00:34:29 crc kubenswrapper[4774]: W0127 00:34:29.736896 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40a1ab63_3b75_4f20_a692_58d80ccd2847.slice/crio-35c64f467eba82058b15d5181ef6d1a8e2bdce8df5cf4eacf8f177bfd0c6b3f0 WatchSource:0}: Error finding container 35c64f467eba82058b15d5181ef6d1a8e2bdce8df5cf4eacf8f177bfd0c6b3f0: Status 404 returned error can't find the container with id 35c64f467eba82058b15d5181ef6d1a8e2bdce8df5cf4eacf8f177bfd0c6b3f0 Jan 27 00:34:29 crc kubenswrapper[4774]: I0127 00:34:29.797115 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp"] Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.161350 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.161403 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"35c64f467eba82058b15d5181ef6d1a8e2bdce8df5cf4eacf8f177bfd0c6b3f0"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.163953 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"723dcc4f289a2404c75f71d52ad8aba594be567de1c9fa4614b1de0a72fb3740"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.174130 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"695367d2-8c58-4bee-adcc-61c6d1b7457b","Type":"ContainerStarted","Data":"c960e0c21bcfc444a274889c5d7a1571cb17081511a03b672c22fd296c6b3d7e"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.190829 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"4c760483fbd2a1a7acf7ac960a78224bab5e8dc9e4bb33d2f80641b2bd1befaa"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.193083 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" podStartSLOduration=10.046081442 podStartE2EDuration="15.193052716s" podCreationTimestamp="2026-01-27 00:34:15 +0000 UTC" firstStartedPulling="2026-01-27 00:34:24.312625435 +0000 UTC m=+1642.618402319" lastFinishedPulling="2026-01-27 00:34:29.459596699 +0000 UTC m=+1647.765373593" observedRunningTime="2026-01-27 00:34:30.184996139 +0000 UTC m=+1648.490773043" watchObservedRunningTime="2026-01-27 00:34:30.193052716 +0000 UTC m=+1648.498829600" Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.219188 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"0682319276b5bd880656870f2652514e216a14cbef4bb9526e80e68c3ca94bad"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.230197 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.230255 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"02c4ce06168db4f170d105ebb8fa3c2ff2b4beb60ee2ef582f3a1e8457ac7cca"} Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.234677 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=18.705891366 podStartE2EDuration="32.23458495s" podCreationTimestamp="2026-01-27 00:33:58 +0000 UTC" firstStartedPulling="2026-01-27 00:34:15.982630203 +0000 UTC m=+1634.288407087" lastFinishedPulling="2026-01-27 00:34:29.511323787 +0000 UTC m=+1647.817100671" observedRunningTime="2026-01-27 00:34:30.218365845 +0000 UTC m=+1648.524142729" watchObservedRunningTime="2026-01-27 00:34:30.23458495 +0000 UTC m=+1648.540361834" Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.269199 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" podStartSLOduration=6.071436872 podStartE2EDuration="11.269168219s" podCreationTimestamp="2026-01-27 00:34:19 +0000 UTC" firstStartedPulling="2026-01-27 00:34:24.31208847 +0000 UTC m=+1642.617865364" lastFinishedPulling="2026-01-27 00:34:29.509819817 +0000 UTC m=+1647.815596711" observedRunningTime="2026-01-27 00:34:30.247128577 +0000 UTC m=+1648.552905451" watchObservedRunningTime="2026-01-27 00:34:30.269168219 +0000 UTC m=+1648.574945103" Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.269755 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" podStartSLOduration=3.253443765 podStartE2EDuration="17.269746735s" podCreationTimestamp="2026-01-27 00:34:13 +0000 UTC" firstStartedPulling="2026-01-27 00:34:15.491542403 +0000 UTC m=+1633.797319277" lastFinishedPulling="2026-01-27 00:34:29.507845363 +0000 UTC m=+1647.813622247" observedRunningTime="2026-01-27 00:34:30.267617637 +0000 UTC m=+1648.573394541" watchObservedRunningTime="2026-01-27 00:34:30.269746735 +0000 UTC m=+1648.575523609" Jan 27 00:34:30 crc kubenswrapper[4774]: I0127 00:34:30.361827 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:30 crc kubenswrapper[4774]: E0127 00:34:30.363374 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 
00:34:31 crc kubenswrapper[4774]: I0127 00:34:31.238844 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"3ecdf66505bc29c0f15176c86ae97d1090bd079768c2c1e5209c055cae244ec2"} Jan 27 00:34:31 crc kubenswrapper[4774]: I0127 00:34:31.242021 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"8c54909466989ae687c06c9a207a5ec93894db780381f2df64a8096664a167cf"} Jan 27 00:34:31 crc kubenswrapper[4774]: I0127 00:34:31.289268 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" podStartSLOduration=2.943343535 podStartE2EDuration="3.28924346s" podCreationTimestamp="2026-01-27 00:34:28 +0000 UTC" firstStartedPulling="2026-01-27 00:34:29.827439672 +0000 UTC m=+1648.133216566" lastFinishedPulling="2026-01-27 00:34:30.173339587 +0000 UTC m=+1648.479116491" observedRunningTime="2026-01-27 00:34:31.287688268 +0000 UTC m=+1649.593465152" watchObservedRunningTime="2026-01-27 00:34:31.28924346 +0000 UTC m=+1649.595020344" Jan 27 00:34:31 crc kubenswrapper[4774]: I0127 00:34:31.294308 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" podStartSLOduration=3.8625214850000003 podStartE2EDuration="4.294297355s" podCreationTimestamp="2026-01-27 00:34:27 +0000 UTC" firstStartedPulling="2026-01-27 00:34:29.744517296 +0000 UTC m=+1648.050294180" lastFinishedPulling="2026-01-27 00:34:30.176293156 +0000 UTC m=+1648.482070050" observedRunningTime="2026-01-27 00:34:31.26320003 +0000 UTC m=+1649.568976914" watchObservedRunningTime="2026-01-27 00:34:31.294297355 +0000 UTC m=+1649.600074239" Jan 27 00:34:32 crc kubenswrapper[4774]: I0127 00:34:32.955876 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Jan 27 00:34:33 crc kubenswrapper[4774]: I0127 00:34:33.017996 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Jan 27 00:34:33 crc kubenswrapper[4774]: I0127 00:34:33.320647 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Jan 27 00:34:41 crc kubenswrapper[4774]: I0127 00:34:41.485975 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:34:41 crc kubenswrapper[4774]: I0127 00:34:41.486912 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerName="default-interconnect" containerID="cri-o://ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" gracePeriod=30 Jan 27 00:34:41 crc kubenswrapper[4774]: I0127 00:34:41.890611 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.004588 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005188 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005265 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005348 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005383 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005415 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.005439 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") pod \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\" (UID: \"2c8f3edd-38d2-4a50-add2-873dd1ac35e5\") " Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.007003 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "sasl-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.012803 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.013242 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w" (OuterVolumeSpecName: "kube-api-access-qdh7w") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "kube-api-access-qdh7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.013372 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.014063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.014979 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.029121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "2c8f3edd-38d2-4a50-add2-873dd1ac35e5" (UID: "2c8f3edd-38d2-4a50-add2-873dd1ac35e5"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.108949 4774 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109021 4774 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109050 4774 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109074 4774 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-sasl-users\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109094 4774 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109114 4774 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.109137 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdh7w\" (UniqueName: \"kubernetes.io/projected/2c8f3edd-38d2-4a50-add2-873dd1ac35e5-kube-api-access-qdh7w\") on node \"crc\" DevicePath \"\"" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.348619 4774 generic.go:334] "Generic (PLEG): container finished" podID="d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6" containerID="4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.349205 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerDied","Data":"4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.350763 4774 generic.go:334] "Generic (PLEG): container finished" podID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerID="ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.350839 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.350828 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" event={"ID":"2c8f3edd-38d2-4a50-add2-873dd1ac35e5","Type":"ContainerDied","Data":"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.350979 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-b5bgt" event={"ID":"2c8f3edd-38d2-4a50-add2-873dd1ac35e5","Type":"ContainerDied","Data":"a5ba257ff6bd2d15abb20ec26902d31e3cb2085f1429dfba58273e45e14cd88c"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.351013 4774 scope.go:117] "RemoveContainer" containerID="ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.353990 4774 scope.go:117] "RemoveContainer" containerID="4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.354060 4774 generic.go:334] "Generic (PLEG): container finished" podID="221c6d11-7462-4578-bbb1-a78ee6bad7c0" containerID="a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.354129 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerDied","Data":"a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.355025 4774 scope.go:117] "RemoveContainer" containerID="a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.361493 4774 generic.go:334] "Generic (PLEG): container finished" podID="78320a5f-ee7d-4190-8af3-9d9609bcf111" containerID="74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.373665 4774 generic.go:334] "Generic (PLEG): container finished" podID="40a1ab63-3b75-4f20-a692-58d80ccd2847" containerID="58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c" exitCode=0 Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.384244 4774 scope.go:117] "RemoveContainer" containerID="ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" Jan 27 00:34:42 crc kubenswrapper[4774]: E0127 00:34:42.390423 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f\": container with ID starting with ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f not found: ID does not exist" containerID="ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.390494 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f"} err="failed to get container status \"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f\": rpc error: code = NotFound desc = could not find container \"ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f\": container with ID starting with 
ac9cabb6d85b525fd7a9388a9c68cb8444b56f66b9ac6be1503e2a843650111f not found: ID does not exist" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.415261 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerDied","Data":"74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.415346 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerDied","Data":"58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c"} Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.416147 4774 scope.go:117] "RemoveContainer" containerID="58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.423197 4774 scope.go:117] "RemoveContainer" containerID="74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f" Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.493028 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:34:42 crc kubenswrapper[4774]: I0127 00:34:42.501134 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-b5bgt"] Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.086413 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rjwl8"] Jan 27 00:34:43 crc kubenswrapper[4774]: E0127 00:34:43.087814 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerName="default-interconnect" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.087952 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerName="default-interconnect" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.088218 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" containerName="default-interconnect" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.088991 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.091851 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.092269 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.092468 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.092600 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.092723 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-cdfh6" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.093433 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.093822 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.110529 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rjwl8"] Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148265 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dq2z\" (UniqueName: \"kubernetes.io/projected/2480c4b1-c406-412e-9258-a94358f2c1c1-kube-api-access-6dq2z\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148342 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148374 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-users\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148417 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148438 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148461 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-config\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.148487 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249483 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249550 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-users\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249614 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249639 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-config\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249671 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.249708 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dq2z\" (UniqueName: \"kubernetes.io/projected/2480c4b1-c406-412e-9258-a94358f2c1c1-kube-api-access-6dq2z\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.251894 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-config\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.263625 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-sasl-users\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.264832 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.264879 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.264891 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.268760 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/2480c4b1-c406-412e-9258-a94358f2c1c1-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.270581 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6dq2z\" (UniqueName: \"kubernetes.io/projected/2480c4b1-c406-412e-9258-a94358f2c1c1-kube-api-access-6dq2z\") pod \"default-interconnect-68864d46cb-rjwl8\" (UID: \"2480c4b1-c406-412e-9258-a94358f2c1c1\") " pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.384601 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.387774 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.390556 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.393931 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.397127 4774 generic.go:334] "Generic (PLEG): container finished" podID="cce880cf-c820-49a2-9cf3-c2161499d51f" containerID="2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15" exitCode=0 Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.397224 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerDied","Data":"2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15"} Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.399400 4774 scope.go:117] "RemoveContainer" containerID="2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.406430 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" Jan 27 00:34:43 crc kubenswrapper[4774]: I0127 00:34:43.782896 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-rjwl8"] Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.033925 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.034904 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.038429 4774 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.038544 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.043282 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.171341 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/fe039606-88c5-414d-b20c-a129a4bb0782-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.171400 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/fe039606-88c5-414d-b20c-a129a4bb0782-qdr-test-config\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.171423 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkm2p\" (UniqueName: \"kubernetes.io/projected/fe039606-88c5-414d-b20c-a129a4bb0782-kube-api-access-qkm2p\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.272657 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/fe039606-88c5-414d-b20c-a129a4bb0782-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.272749 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkm2p\" (UniqueName: \"kubernetes.io/projected/fe039606-88c5-414d-b20c-a129a4bb0782-kube-api-access-qkm2p\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.272773 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/fe039606-88c5-414d-b20c-a129a4bb0782-qdr-test-config\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.273737 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/fe039606-88c5-414d-b20c-a129a4bb0782-qdr-test-config\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.280418 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/fe039606-88c5-414d-b20c-a129a4bb0782-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: 
\"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.293526 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkm2p\" (UniqueName: \"kubernetes.io/projected/fe039606-88c5-414d-b20c-a129a4bb0782-kube-api-access-qkm2p\") pod \"qdr-test\" (UID: \"fe039606-88c5-414d-b20c-a129a4bb0782\") " pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.366491 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8f3edd-38d2-4a50-add2-873dd1ac35e5" path="/var/lib/kubelet/pods/2c8f3edd-38d2-4a50-add2-873dd1ac35e5/volumes" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.368658 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.415161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" event={"ID":"2480c4b1-c406-412e-9258-a94358f2c1c1","Type":"ContainerStarted","Data":"19d79350a136be565c6857d0a87082d9b9549bc42dffec45f34fa0037b1f5da4"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.415219 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" event={"ID":"2480c4b1-c406-412e-9258-a94358f2c1c1","Type":"ContainerStarted","Data":"1ca32d8536934458b3214a3be428cba7ebec47814fa35d583e204f1dd4a4955e"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.421410 4774 generic.go:334] "Generic (PLEG): container finished" podID="d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6" containerID="b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf" exitCode=0 Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.421482 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerDied","Data":"b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.421524 4774 scope.go:117] "RemoveContainer" containerID="4f37ba1566a4b796836eee776bea95eb9e436293729ba46e3dcd058ce82cd3bb" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.422085 4774 scope.go:117] "RemoveContainer" containerID="b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf" Jan 27 00:34:44 crc kubenswrapper[4774]: E0127 00:34:44.422280 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9_service-telemetry(d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" podUID="d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.426911 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.443697 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-rjwl8" podStartSLOduration=3.443673326 
podStartE2EDuration="3.443673326s" podCreationTimestamp="2026-01-27 00:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 00:34:44.436353339 +0000 UTC m=+1662.742130223" watchObservedRunningTime="2026-01-27 00:34:44.443673326 +0000 UTC m=+1662.749450210" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.446012 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerDied","Data":"d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.446026 4774 generic.go:334] "Generic (PLEG): container finished" podID="221c6d11-7462-4578-bbb1-a78ee6bad7c0" containerID="d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9" exitCode=0 Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.446780 4774 scope.go:117] "RemoveContainer" containerID="d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9" Jan 27 00:34:44 crc kubenswrapper[4774]: E0127 00:34:44.447166 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr_service-telemetry(221c6d11-7462-4578-bbb1-a78ee6bad7c0)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" podUID="221c6d11-7462-4578-bbb1-a78ee6bad7c0" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.481441 4774 generic.go:334] "Generic (PLEG): container finished" podID="78320a5f-ee7d-4190-8af3-9d9609bcf111" containerID="3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c" exitCode=0 Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.481484 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerDied","Data":"3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.484011 4774 scope.go:117] "RemoveContainer" containerID="3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c" Jan 27 00:34:44 crc kubenswrapper[4774]: E0127 00:34:44.484258 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp_service-telemetry(78320a5f-ee7d-4190-8af3-9d9609bcf111)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" podUID="78320a5f-ee7d-4190-8af3-9d9609bcf111" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.496329 4774 scope.go:117] "RemoveContainer" containerID="a7d5fa1cc4b7c3203e38d581c1fee7da028e28e96cebdb25bcb5dca2d68d5a4f" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.504352 4774 generic.go:334] "Generic (PLEG): container finished" podID="40a1ab63-3b75-4f20-a692-58d80ccd2847" containerID="de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30" exitCode=0 Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.504415 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" 
event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerDied","Data":"de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30"} Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.505134 4774 scope.go:117] "RemoveContainer" containerID="de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30" Jan 27 00:34:44 crc kubenswrapper[4774]: E0127 00:34:44.505386 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-564958f777-cn8q8_service-telemetry(40a1ab63-3b75-4f20-a692-58d80ccd2847)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" podUID="40a1ab63-3b75-4f20-a692-58d80ccd2847" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.538715 4774 scope.go:117] "RemoveContainer" containerID="74df0230fe8ea32d4f70de22858c8d9d25c4639a075ffac3c84a13913df6185f" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.607028 4774 scope.go:117] "RemoveContainer" containerID="58f83e6debebc5843c00cb7eeae5c528ab9844c886afeaef42602c15951a040c" Jan 27 00:34:44 crc kubenswrapper[4774]: I0127 00:34:44.911967 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Jan 27 00:34:44 crc kubenswrapper[4774]: W0127 00:34:44.916136 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe039606_88c5_414d_b20c_a129a4bb0782.slice/crio-a4a8034afc6ceb9593b356b70741af03ed4076db1700b116b3f070c4e2dbd4cc WatchSource:0}: Error finding container a4a8034afc6ceb9593b356b70741af03ed4076db1700b116b3f070c4e2dbd4cc: Status 404 returned error can't find the container with id a4a8034afc6ceb9593b356b70741af03ed4076db1700b116b3f070c4e2dbd4cc Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.356417 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:45 crc kubenswrapper[4774]: E0127 00:34:45.356883 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.535920 4774 generic.go:334] "Generic (PLEG): container finished" podID="cce880cf-c820-49a2-9cf3-c2161499d51f" containerID="eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745" exitCode=0 Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.536431 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerDied","Data":"eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745"} Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.536484 4774 scope.go:117] "RemoveContainer" containerID="2c11176ba606ee0424f5808b875366de2c6dcfe3113b88c461b978c03af1bf15" Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.537239 4774 scope.go:117] "RemoveContainer" containerID="eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745" Jan 27 00:34:45 crc kubenswrapper[4774]: E0127 
00:34:45.537515 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g_service-telemetry(cce880cf-c820-49a2-9cf3-c2161499d51f)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" podUID="cce880cf-c820-49a2-9cf3-c2161499d51f" Jan 27 00:34:45 crc kubenswrapper[4774]: I0127 00:34:45.631769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"fe039606-88c5-414d-b20c-a129a4bb0782","Type":"ContainerStarted","Data":"a4a8034afc6ceb9593b356b70741af03ed4076db1700b116b3f070c4e2dbd4cc"} Jan 27 00:34:55 crc kubenswrapper[4774]: I0127 00:34:55.358270 4774 scope.go:117] "RemoveContainer" containerID="b9898876635ec2d4af58cf5374041301bf103fd58c245f17fb5c65e927758ecf" Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.248442 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.357021 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:34:56 crc kubenswrapper[4774]: E0127 00:34:56.357281 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.357731 4774 scope.go:117] "RemoveContainer" containerID="eaa793c6854b8af24202f25f79d08e139aab8cabda8244e19d2dec680342c745" Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.763806 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g" event={"ID":"cce880cf-c820-49a2-9cf3-c2161499d51f","Type":"ContainerStarted","Data":"f8f10b196b67872bb3419b950227b872761dd478b9fdeeb86f5c83fb84943e1b"} Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.766640 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"fe039606-88c5-414d-b20c-a129a4bb0782","Type":"ContainerStarted","Data":"2e8f6a82588628633e3da4cb06eb019f4f21a9f6756ee608368c7172acae123e"} Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.771078 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9" event={"ID":"d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6","Type":"ContainerStarted","Data":"c483661aab70268feeb0ad1f4b7f9fae9b4ed2ad388c4936ecacbdc77485c2fb"} Jan 27 00:34:56 crc kubenswrapper[4774]: I0127 00:34:56.810563 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.383996196 podStartE2EDuration="13.810526763s" podCreationTimestamp="2026-01-27 00:34:43 +0000 UTC" firstStartedPulling="2026-01-27 00:34:44.91919625 +0000 UTC m=+1663.224973134" lastFinishedPulling="2026-01-27 00:34:56.345726817 +0000 UTC m=+1674.651503701" observedRunningTime="2026-01-27 00:34:56.80928134 +0000 UTC m=+1675.115058224" watchObservedRunningTime="2026-01-27 00:34:56.810526763 +0000 UTC m=+1675.116303637" Jan 
27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.145305 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mjtw7"] Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.146627 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.150494 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.150808 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.152643 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.153021 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.154097 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.155006 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.173667 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mjtw7"] Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240272 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240348 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240378 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240477 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240534 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" 
(UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240566 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.240602 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.341641 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342112 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342155 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342187 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342760 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.343202 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 
crc kubenswrapper[4774]: I0127 00:34:57.343470 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344000 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.342794 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344112 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344159 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344634 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.344959 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.357190 4774 scope.go:117] "RemoveContainer" containerID="3c779b9ff0e29a6c4656f9b97f44ee83dfa3b7b8e130cc42bd59a597e9ce8b4c" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.379056 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") pod \"stf-smoketest-smoke1-mjtw7\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.461183 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.542139 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.543030 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.551997 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.656959 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") pod \"curl\" (UID: \"8b27e0f1-7f20-4c92-b279-4a642379fc41\") " pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.760790 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") pod \"curl\" (UID: \"8b27e0f1-7f20-4c92-b279-4a642379fc41\") " pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.816515 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") pod \"curl\" (UID: \"8b27e0f1-7f20-4c92-b279-4a642379fc41\") " pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.833201 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp" event={"ID":"78320a5f-ee7d-4190-8af3-9d9609bcf111","Type":"ContainerStarted","Data":"a1875f622f815691b08bc79fca524319649ee79ff21f7659b380e66e7331878b"} Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.912873 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Jan 27 00:34:57 crc kubenswrapper[4774]: I0127 00:34:57.997406 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-mjtw7"] Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.149093 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.357026 4774 scope.go:117] "RemoveContainer" containerID="de57d00be6247b9776d90dd256bd52c864446df8475238e51b4cc4b0797c6c30" Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.358273 4774 scope.go:117] "RemoveContainer" containerID="d31f6718b5972460b7dfbd88cdf2e00be97bc26c0bfac2637e829c88a7995cf9" Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.841078 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerStarted","Data":"a4b45d7c5256711b20005d0e8d1e762fe1d54612eb9b855eb186ad80c6d704c0"} Jan 27 00:34:58 crc kubenswrapper[4774]: I0127 00:34:58.842332 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"8b27e0f1-7f20-4c92-b279-4a642379fc41","Type":"ContainerStarted","Data":"a4b4800a9182a2cede7edf0bba16cdfeda5b569ebf0489b187dc1fbb7d2dca40"} Jan 27 00:34:59 crc kubenswrapper[4774]: I0127 00:34:59.869417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr" event={"ID":"221c6d11-7462-4578-bbb1-a78ee6bad7c0","Type":"ContainerStarted","Data":"cf05f6a90b463ca3e2066dfcaf8c4a4785856be0854c2186812ee45208a925fc"} Jan 27 00:34:59 crc kubenswrapper[4774]: I0127 00:34:59.872491 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-564958f777-cn8q8" event={"ID":"40a1ab63-3b75-4f20-a692-58d80ccd2847","Type":"ContainerStarted","Data":"ea994568bce6d862e732b433ee25ed08324f97f3a871d2bfcb19e954a1bd5899"} Jan 27 00:35:00 crc kubenswrapper[4774]: I0127 00:35:00.883195 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"8b27e0f1-7f20-4c92-b279-4a642379fc41","Type":"ContainerStarted","Data":"18a2e3275099a59d98ea59bf9877de771dc72d4995929572319101ea7c183b79"} Jan 27 00:35:00 crc kubenswrapper[4774]: I0127 00:35:00.900362 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/curl" podStartSLOduration=1.399244628 podStartE2EDuration="3.900338351s" podCreationTimestamp="2026-01-27 00:34:57 +0000 UTC" firstStartedPulling="2026-01-27 00:34:58.159750039 +0000 UTC m=+1676.465526923" lastFinishedPulling="2026-01-27 00:35:00.660843752 +0000 UTC m=+1678.966620646" observedRunningTime="2026-01-27 00:35:00.894467433 +0000 UTC m=+1679.200244337" watchObservedRunningTime="2026-01-27 00:35:00.900338351 +0000 UTC m=+1679.206115235" Jan 27 00:35:01 crc kubenswrapper[4774]: I0127 00:35:01.901396 4774 generic.go:334] "Generic (PLEG): container finished" podID="8b27e0f1-7f20-4c92-b279-4a642379fc41" containerID="18a2e3275099a59d98ea59bf9877de771dc72d4995929572319101ea7c183b79" exitCode=0 Jan 27 00:35:01 crc kubenswrapper[4774]: I0127 00:35:01.901572 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"8b27e0f1-7f20-4c92-b279-4a642379fc41","Type":"ContainerDied","Data":"18a2e3275099a59d98ea59bf9877de771dc72d4995929572319101ea7c183b79"} Jan 27 00:35:11 crc kubenswrapper[4774]: 
I0127 00:35:11.357887 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:35:11 crc kubenswrapper[4774]: E0127 00:35:11.358422 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:35:13 crc kubenswrapper[4774]: E0127 00:35:13.495085 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/tripleomastercentos9/openstack-collectd:current-tripleo" Jan 27 00:35:13 crc kubenswrapper[4774]: E0127 00:35:13.496646 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:smoketest-collectd,Image:quay.io/tripleomastercentos9/openstack-collectd:current-tripleo,Command:[/smoketest_collectd_entrypoint.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLOUDNAME,Value:smoke1,ValueFrom:nil,},EnvVar{Name:ELASTICSEARCH_AUTH_PASS,Value:TRctxSF5yjVx7wM13DX9sDEj,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_AUTH_TOKEN,Value:eyJhbGciOiJSUzI1NiIsImtpZCI6InF6SnFxNFFjbVk5VmJQZ2dNMmUxdHFmTlJlVWx4UDhSTlhIamV3RUx4WU0ifQ.eyJhdWQiOlsiaHR0cHM6Ly9rdWJlcm5ldGVzLmRlZmF1bHQuc3ZjIl0sImV4cCI6MTc2OTQ3NzY3OSwiaWF0IjoxNzY5NDc0MDc5LCJpc3MiOiJodHRwczovL2t1YmVybmV0ZXMuZGVmYXVsdC5zdmMiLCJqdGkiOiI3Y2E2ODBiNi1iNWVhLTQyNTgtYjQwZi0xMjhkZjkyM2NhMzAiLCJrdWJlcm5ldGVzLmlvIjp7Im5hbWVzcGFjZSI6InNlcnZpY2UtdGVsZW1ldHJ5Iiwic2VydmljZWFjY291bnQiOnsibmFtZSI6InN0Zi1wcm9tZXRoZXVzLXJlYWRlciIsInVpZCI6IjVlODU2Y2JlLWRhMWUtNGFkZS05Yjg0LWE1ZWJlMzg1Njk0ZSJ9fSwibmJmIjoxNzY5NDc0MDc5LCJzdWIiOiJzeXN0ZW06c2VydmljZWFjY291bnQ6c2VydmljZS10ZWxlbWV0cnk6c3RmLXByb21ldGhldXMtcmVhZGVyIn0.ldNLwti5zqlucyXu-cOzkivSmAZvjduc3ztWpwB_lYEjhgfKJZyEQuCfg3o_RhksF0LNMiXmNWuTcDqcuiJLZhxWYeyI2Nb2Ved0AkZvOroH2Pntmd4T_iy319JZxx-34yJ7cBo1sErdxeVbr63jrYT0HVzyIi3Xpq7XNMGIwlumW8iVc8E5b_7w2DfMl-gv7gLr2GTY5PO8fE-4WKBGLkKh10GPXVxBL5ToyvXfMjM78N618MjQfmONl3hiIEY023gyLM4HK7PUWF-VBgxA4ba1NVPsQGl2dNFBf2B2xJNvT19U85NmW_gbTa5Dst41w5SJBrUEGVp42Ne1pXoqAqOnSla3caDcMugh2dHXnP4BsOV14T6PhtwcDR8aeQUibiUqGI23ONmG3JXMwR80zV9rvQvFOSCDKU8bUpaY9XIqEJjIBPZ4VLg22tMT2egqspnhl0dnO925MTESSXXgZfJiMBPhPJij4RQyPgNQzkgUJNSDoxrjpgl4h2nMN6CNhws8t8mV4uooc9qxVum1cUdccCeNjBW2stZ0gOatw1LkPtwXacIS1GMTemTVm9dCnNJNza2Ov6LM0mAgQUz6r1_atHH70pZYwHXP1W4K2oxgw3aP9-ivcnFo3GSSbrPwkuO5LTi_OJt41YU5vNMN67aUKgjo_ikedOKtqB9sn5o,ValueFrom:nil,},EnvVar{Name:OBSERVABILITY_STRATEGY,Value:<>,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:collectd-config,ReadOnly:false,MountPath:/etc/minimal-collectd.conf.template,SubPath:minimal-collectd.conf.template,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sensubility-config,ReadOnly:false,MountPath:/etc/collectd-sensubility.conf,SubPath:collectd-sensubility.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:healthcheck-log,ReadOnly:false,MountPath:/healthcheck.log,SubPath:healthcheck.log,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:collectd-entrypoint-script,ReadOnly:false,MountPath:/smoketest_collectd_entrypoint.sh,SubPath:s
moketest_collectd_entrypoint.sh,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9pjvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod stf-smoketest-smoke1-mjtw7_service-telemetry(e41bad20-a43c-4561-aefa-15b6cdfb715b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.510773 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.564355 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") pod \"8b27e0f1-7f20-4c92-b279-4a642379fc41\" (UID: \"8b27e0f1-7f20-4c92-b279-4a642379fc41\") " Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.576140 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz" (OuterVolumeSpecName: "kube-api-access-fkpbz") pod "8b27e0f1-7f20-4c92-b279-4a642379fc41" (UID: "8b27e0f1-7f20-4c92-b279-4a642379fc41"). InnerVolumeSpecName "kube-api-access-fkpbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.668484 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkpbz\" (UniqueName: \"kubernetes.io/projected/8b27e0f1-7f20-4c92-b279-4a642379fc41-kube-api-access-fkpbz\") on node \"crc\" DevicePath \"\"" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.683748 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_8b27e0f1-7f20-4c92-b279-4a642379fc41/curl/0.log" Jan 27 00:35:13 crc kubenswrapper[4774]: I0127 00:35:13.974069 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-mzmt2_60dce4ee-cdb6-4418-ac73-063f48c8be7e/prometheus-webhook-snmp/0.log" Jan 27 00:35:14 crc kubenswrapper[4774]: I0127 00:35:14.009675 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"8b27e0f1-7f20-4c92-b279-4a642379fc41","Type":"ContainerDied","Data":"a4b4800a9182a2cede7edf0bba16cdfeda5b569ebf0489b187dc1fbb7d2dca40"} Jan 27 00:35:14 crc kubenswrapper[4774]: I0127 00:35:14.009716 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b4800a9182a2cede7edf0bba16cdfeda5b569ebf0489b187dc1fbb7d2dca40" Jan 27 00:35:14 crc kubenswrapper[4774]: I0127 00:35:14.009768 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Jan 27 00:35:16 crc kubenswrapper[4774]: I0127 00:35:16.484840 4774 scope.go:117] "RemoveContainer" containerID="4f2d7297d4d28dd848f59b341d688f7920adc8f241b7e1da6ad4079f86054ae7" Jan 27 00:35:17 crc kubenswrapper[4774]: I0127 00:35:17.993389 4774 scope.go:117] "RemoveContainer" containerID="1acb6f758817187337affc4bdd3cea29ad0aa6e46b274461e5eea0064a19c955" Jan 27 00:35:20 crc kubenswrapper[4774]: E0127 00:35:20.672922 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" Jan 27 00:35:21 crc kubenswrapper[4774]: I0127 00:35:21.087446 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerStarted","Data":"65e528d8b0ae9b041234c20273bd32c0fb702d531f2d23967dddb18fbfaea7f8"} Jan 27 00:35:21 crc kubenswrapper[4774]: E0127 00:35:21.088979 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tripleomastercentos9/openstack-collectd:current-tripleo\\\"\"" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" Jan 27 00:35:22 crc kubenswrapper[4774]: E0127 00:35:22.098473 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tripleomastercentos9/openstack-collectd:current-tripleo\\\"\"" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" Jan 27 00:35:25 crc kubenswrapper[4774]: I0127 00:35:25.357270 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:35:25 crc kubenswrapper[4774]: E0127 00:35:25.358066 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:35:38 crc kubenswrapper[4774]: I0127 00:35:38.255625 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerStarted","Data":"612cbdd3efc1246a6024e827473fb19920043fe52ca7f85b7878afcf61cf927f"} Jan 27 00:35:38 crc kubenswrapper[4774]: I0127 00:35:38.356875 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:35:38 crc kubenswrapper[4774]: E0127 00:35:38.357218 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" 
podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:35:44 crc kubenswrapper[4774]: I0127 00:35:44.148186 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-mzmt2_60dce4ee-cdb6-4418-ac73-063f48c8be7e/prometheus-webhook-snmp/0.log" Jan 27 00:35:49 crc kubenswrapper[4774]: I0127 00:35:49.357440 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:35:49 crc kubenswrapper[4774]: E0127 00:35:49.358204 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:35:52 crc kubenswrapper[4774]: I0127 00:35:52.389377 4774 generic.go:334] "Generic (PLEG): container finished" podID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerID="65e528d8b0ae9b041234c20273bd32c0fb702d531f2d23967dddb18fbfaea7f8" exitCode=0 Jan 27 00:35:52 crc kubenswrapper[4774]: I0127 00:35:52.389445 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerDied","Data":"65e528d8b0ae9b041234c20273bd32c0fb702d531f2d23967dddb18fbfaea7f8"} Jan 27 00:35:52 crc kubenswrapper[4774]: I0127 00:35:52.390211 4774 scope.go:117] "RemoveContainer" containerID="65e528d8b0ae9b041234c20273bd32c0fb702d531f2d23967dddb18fbfaea7f8" Jan 27 00:36:03 crc kubenswrapper[4774]: I0127 00:36:03.356443 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:03 crc kubenswrapper[4774]: E0127 00:36:03.357558 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:36:12 crc kubenswrapper[4774]: I0127 00:36:12.564220 4774 generic.go:334] "Generic (PLEG): container finished" podID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerID="612cbdd3efc1246a6024e827473fb19920043fe52ca7f85b7878afcf61cf927f" exitCode=0 Jan 27 00:36:12 crc kubenswrapper[4774]: I0127 00:36:12.564284 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerDied","Data":"612cbdd3efc1246a6024e827473fb19920043fe52ca7f85b7878afcf61cf927f"} Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.819134 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986676 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986748 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986790 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986816 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986844 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986918 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.986947 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") pod \"e41bad20-a43c-4561-aefa-15b6cdfb715b\" (UID: \"e41bad20-a43c-4561-aefa-15b6cdfb715b\") " Jan 27 00:36:13 crc kubenswrapper[4774]: I0127 00:36:13.992555 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq" (OuterVolumeSpecName: "kube-api-access-9pjvq") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "kube-api-access-9pjvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.005101 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "collectd-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.007632 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.008333 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.010479 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.013783 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.027075 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "e41bad20-a43c-4561-aefa-15b6cdfb715b" (UID: "e41bad20-a43c-4561-aefa-15b6cdfb715b"). InnerVolumeSpecName "healthcheck-log". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088090 4774 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088123 4774 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-sensubility-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088132 4774 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-healthcheck-log\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088140 4774 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-collectd-config\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088148 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pjvq\" (UniqueName: \"kubernetes.io/projected/e41bad20-a43c-4561-aefa-15b6cdfb715b-kube-api-access-9pjvq\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088157 4774 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.088166 4774 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/e41bad20-a43c-4561-aefa-15b6cdfb715b-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.580405 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" event={"ID":"e41bad20-a43c-4561-aefa-15b6cdfb715b","Type":"ContainerDied","Data":"a4b45d7c5256711b20005d0e8d1e762fe1d54612eb9b855eb186ad80c6d704c0"} Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.580450 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b45d7c5256711b20005d0e8d1e762fe1d54612eb9b855eb186ad80c6d704c0" Jan 27 00:36:14 crc kubenswrapper[4774]: I0127 00:36:14.580537 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-mjtw7" Jan 27 00:36:15 crc kubenswrapper[4774]: I0127 00:36:15.769228 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-mjtw7_e41bad20-a43c-4561-aefa-15b6cdfb715b/smoketest-collectd/0.log" Jan 27 00:36:16 crc kubenswrapper[4774]: I0127 00:36:16.023575 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-mjtw7_e41bad20-a43c-4561-aefa-15b6cdfb715b/smoketest-ceilometer/0.log" Jan 27 00:36:16 crc kubenswrapper[4774]: I0127 00:36:16.293171 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-rjwl8_2480c4b1-c406-412e-9258-a94358f2c1c1/default-interconnect/0.log" Jan 27 00:36:16 crc kubenswrapper[4774]: I0127 00:36:16.551363 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr_221c6d11-7462-4578-bbb1-a78ee6bad7c0/bridge/2.log" Jan 27 00:36:16 crc kubenswrapper[4774]: I0127 00:36:16.797063 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-m57rr_221c6d11-7462-4578-bbb1-a78ee6bad7c0/sg-core/0.log" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.054228 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-564958f777-cn8q8_40a1ab63-3b75-4f20-a692-58d80ccd2847/bridge/2.log" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.303383 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-564958f777-cn8q8_40a1ab63-3b75-4f20-a692-58d80ccd2847/sg-core/0.log" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.356443 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:17 crc kubenswrapper[4774]: E0127 00:36:17.356674 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.551663 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9_d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6/bridge/2.log" Jan 27 00:36:17 crc kubenswrapper[4774]: I0127 00:36:17.796911 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-mlqt9_d82a9cf9-0ba7-416d-93b4-b3bd9bc6a9f6/sg-core/0.log" Jan 27 00:36:18 crc kubenswrapper[4774]: I0127 00:36:18.110736 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp_78320a5f-ee7d-4190-8af3-9d9609bcf111/bridge/2.log" Jan 27 00:36:18 crc kubenswrapper[4774]: I0127 00:36:18.366825 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-94c5d79-m2wqp_78320a5f-ee7d-4190-8af3-9d9609bcf111/sg-core/0.log" Jan 27 00:36:18 crc kubenswrapper[4774]: I0127 00:36:18.642471 4774 
log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g_cce880cf-c820-49a2-9cf3-c2161499d51f/bridge/2.log" Jan 27 00:36:18 crc kubenswrapper[4774]: I0127 00:36:18.897256 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-5rd8g_cce880cf-c820-49a2-9cf3-c2161499d51f/sg-core/0.log" Jan 27 00:36:22 crc kubenswrapper[4774]: I0127 00:36:22.636071 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-7b4c7b595f-78n7f_61fa30c6-90bc-4b6c-b850-2b2e59506e08/operator/0.log" Jan 27 00:36:22 crc kubenswrapper[4774]: I0127 00:36:22.928143 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_71259922-2e93-4571-80c6-e054f4372056/prometheus/0.log" Jan 27 00:36:23 crc kubenswrapper[4774]: I0127 00:36:23.219662 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_b4f320b3-4c9f-433a-943a-8f2934061b87/elasticsearch/0.log" Jan 27 00:36:23 crc kubenswrapper[4774]: I0127 00:36:23.543364 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-mzmt2_60dce4ee-cdb6-4418-ac73-063f48c8be7e/prometheus-webhook-snmp/0.log" Jan 27 00:36:23 crc kubenswrapper[4774]: I0127 00:36:23.835376 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_695367d2-8c58-4bee-adcc-61c6d1b7457b/alertmanager/0.log" Jan 27 00:36:31 crc kubenswrapper[4774]: I0127 00:36:31.357852 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:31 crc kubenswrapper[4774]: E0127 00:36:31.358961 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:36:40 crc kubenswrapper[4774]: I0127 00:36:40.112788 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-59f5866557-f7gmp_14f2c5e3-3625-4889-97e4-38820ac84518/operator/0.log" Jan 27 00:36:44 crc kubenswrapper[4774]: I0127 00:36:44.679318 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-7b4c7b595f-78n7f_61fa30c6-90bc-4b6c-b850-2b2e59506e08/operator/0.log" Jan 27 00:36:44 crc kubenswrapper[4774]: I0127 00:36:44.969000 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_fe039606-88c5-414d-b20c-a129a4bb0782/qdr/0.log" Jan 27 00:36:46 crc kubenswrapper[4774]: I0127 00:36:46.357580 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:46 crc kubenswrapper[4774]: E0127 00:36:46.358418 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.564889 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:36:50 crc kubenswrapper[4774]: E0127 00:36:50.566905 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-ceilometer" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.566990 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-ceilometer" Jan 27 00:36:50 crc kubenswrapper[4774]: E0127 00:36:50.567066 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-collectd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567124 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-collectd" Jan 27 00:36:50 crc kubenswrapper[4774]: E0127 00:36:50.567191 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b27e0f1-7f20-4c92-b279-4a642379fc41" containerName="curl" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567250 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b27e0f1-7f20-4c92-b279-4a642379fc41" containerName="curl" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567444 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-collectd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567523 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41bad20-a43c-4561-aefa-15b6cdfb715b" containerName="smoketest-ceilometer" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.567585 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b27e0f1-7f20-4c92-b279-4a642379fc41" containerName="curl" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.568557 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.580901 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.619385 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.619469 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.619502 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721009 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721165 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721650 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.721782 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.747378 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") pod \"certified-operators-lz7gd\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:50 crc kubenswrapper[4774]: I0127 00:36:50.887094 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:36:51 crc kubenswrapper[4774]: I0127 00:36:51.168632 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:36:51 crc kubenswrapper[4774]: I0127 00:36:51.902790 4774 generic.go:334] "Generic (PLEG): container finished" podID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerID="2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9" exitCode=0 Jan 27 00:36:51 crc kubenswrapper[4774]: I0127 00:36:51.902899 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerDied","Data":"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9"} Jan 27 00:36:51 crc kubenswrapper[4774]: I0127 00:36:51.902987 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerStarted","Data":"8b7d6c58c285061374502290ab42c6ad81f37ce1c161d4556ef119eed99d9371"} Jan 27 00:36:52 crc kubenswrapper[4774]: I0127 00:36:52.913830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerStarted","Data":"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a"} Jan 27 00:36:53 crc kubenswrapper[4774]: I0127 00:36:53.929752 4774 generic.go:334] "Generic (PLEG): container finished" podID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerID="4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a" exitCode=0 Jan 27 00:36:53 crc kubenswrapper[4774]: I0127 00:36:53.929840 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerDied","Data":"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a"} Jan 27 00:36:54 crc kubenswrapper[4774]: I0127 00:36:54.940192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerStarted","Data":"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40"} Jan 27 00:36:54 crc kubenswrapper[4774]: I0127 00:36:54.964821 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lz7gd" podStartSLOduration=2.50977763 podStartE2EDuration="4.964803386s" podCreationTimestamp="2026-01-27 00:36:50 +0000 UTC" firstStartedPulling="2026-01-27 00:36:51.9043642 +0000 UTC m=+1790.210141084" lastFinishedPulling="2026-01-27 00:36:54.359389926 +0000 UTC m=+1792.665166840" observedRunningTime="2026-01-27 00:36:54.959358261 +0000 UTC m=+1793.265135155" watchObservedRunningTime="2026-01-27 00:36:54.964803386 +0000 UTC m=+1793.270580270" Jan 27 00:36:59 crc kubenswrapper[4774]: I0127 00:36:59.356830 4774 scope.go:117] "RemoveContainer" 
containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:36:59 crc kubenswrapper[4774]: E0127 00:36:59.357417 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:37:00 crc kubenswrapper[4774]: I0127 00:37:00.887952 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:00 crc kubenswrapper[4774]: I0127 00:37:00.888444 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:00 crc kubenswrapper[4774]: I0127 00:37:00.974010 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:01 crc kubenswrapper[4774]: I0127 00:37:01.087757 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:02 crc kubenswrapper[4774]: I0127 00:37:02.166355 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.034599 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lz7gd" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="registry-server" containerID="cri-o://468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" gracePeriod=2 Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.532031 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.666227 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") pod \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.666901 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") pod \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.667065 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") pod \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\" (UID: \"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3\") " Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.667799 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities" (OuterVolumeSpecName: "utilities") pod "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" (UID: "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.679014 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26" (OuterVolumeSpecName: "kube-api-access-gbb26") pod "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" (UID: "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3"). InnerVolumeSpecName "kube-api-access-gbb26". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.742378 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" (UID: "6ee72b59-8ea9-4448-abcd-e87f3cb6ead3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.768770 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbb26\" (UniqueName: \"kubernetes.io/projected/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-kube-api-access-gbb26\") on node \"crc\" DevicePath \"\"" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.768824 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:37:03 crc kubenswrapper[4774]: I0127 00:37:03.768843 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.049928 4774 generic.go:334] "Generic (PLEG): container finished" podID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerID="468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" exitCode=0 Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.050046 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerDied","Data":"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40"} Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.050097 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lz7gd" event={"ID":"6ee72b59-8ea9-4448-abcd-e87f3cb6ead3","Type":"ContainerDied","Data":"8b7d6c58c285061374502290ab42c6ad81f37ce1c161d4556ef119eed99d9371"} Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.050135 4774 scope.go:117] "RemoveContainer" containerID="468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.050350 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lz7gd" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.087205 4774 scope.go:117] "RemoveContainer" containerID="4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.119107 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.123308 4774 scope.go:117] "RemoveContainer" containerID="2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.131345 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lz7gd"] Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.153562 4774 scope.go:117] "RemoveContainer" containerID="468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" Jan 27 00:37:04 crc kubenswrapper[4774]: E0127 00:37:04.154473 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40\": container with ID starting with 468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40 not found: ID does not exist" containerID="468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.154583 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40"} err="failed to get container status \"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40\": rpc error: code = NotFound desc = could not find container \"468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40\": container with ID starting with 468e5dfe8a9bcd35620c8f16d015433b946ca9a007f7f7efd6f78199201d6f40 not found: ID does not exist" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.154623 4774 scope.go:117] "RemoveContainer" containerID="4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a" Jan 27 00:37:04 crc kubenswrapper[4774]: E0127 00:37:04.155144 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a\": container with ID starting with 4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a not found: ID does not exist" containerID="4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.155210 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a"} err="failed to get container status \"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a\": rpc error: code = NotFound desc = could not find container \"4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a\": container with ID starting with 4b80f80ddeb5d8633ac108004ff86061a078d757b5fa1aa9267fc66433190a5a not found: ID does not exist" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.155255 4774 scope.go:117] "RemoveContainer" containerID="2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9" Jan 27 00:37:04 crc kubenswrapper[4774]: E0127 00:37:04.155763 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9\": container with ID starting with 2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9 not found: ID does not exist" containerID="2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.155800 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9"} err="failed to get container status \"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9\": rpc error: code = NotFound desc = could not find container \"2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9\": container with ID starting with 2170b10062c9a50f28098724acf591e459f43ae3b32a70008197f934b5a5c6a9 not found: ID does not exist" Jan 27 00:37:04 crc kubenswrapper[4774]: I0127 00:37:04.374983 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" path="/var/lib/kubelet/pods/6ee72b59-8ea9-4448-abcd-e87f3cb6ead3/volumes" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.299237 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"] Jan 27 00:37:11 crc kubenswrapper[4774]: E0127 00:37:11.300144 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="extract-content" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.300165 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="extract-content" Jan 27 00:37:11 crc kubenswrapper[4774]: E0127 00:37:11.300188 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="extract-utilities" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.300198 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="extract-utilities" Jan 27 00:37:11 crc kubenswrapper[4774]: E0127 00:37:11.300221 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="registry-server" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.300233 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="registry-server" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.300403 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee72b59-8ea9-4448-abcd-e87f3cb6ead3" containerName="registry-server" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.301089 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.304034 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6xqfx"/"openshift-service-ca.crt" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.305623 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6xqfx"/"kube-root-ca.crt" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.355996 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"] Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.422511 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.422894 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.524756 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.524885 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.525287 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.545714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") pod \"must-gather-qclb8\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:11 crc kubenswrapper[4774]: I0127 00:37:11.646269 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:37:12 crc kubenswrapper[4774]: I0127 00:37:12.166108 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"] Jan 27 00:37:12 crc kubenswrapper[4774]: I0127 00:37:12.366284 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:37:12 crc kubenswrapper[4774]: E0127 00:37:12.366517 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:37:13 crc kubenswrapper[4774]: I0127 00:37:13.133513 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqfx/must-gather-qclb8" event={"ID":"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec","Type":"ContainerStarted","Data":"0020e17fd9488eae2abf19a74cbb871d1c681d3317583ca561ce17e499dde3ee"} Jan 27 00:37:19 crc kubenswrapper[4774]: I0127 00:37:19.192559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqfx/must-gather-qclb8" event={"ID":"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec","Type":"ContainerStarted","Data":"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0"} Jan 27 00:37:19 crc kubenswrapper[4774]: I0127 00:37:19.193343 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqfx/must-gather-qclb8" event={"ID":"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec","Type":"ContainerStarted","Data":"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb"} Jan 27 00:37:19 crc kubenswrapper[4774]: I0127 00:37:19.211296 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6xqfx/must-gather-qclb8" podStartSLOduration=2.28822916 podStartE2EDuration="8.211272004s" podCreationTimestamp="2026-01-27 00:37:11 +0000 UTC" firstStartedPulling="2026-01-27 00:37:12.160792647 +0000 UTC m=+1810.466569531" lastFinishedPulling="2026-01-27 00:37:18.083835491 +0000 UTC m=+1816.389612375" observedRunningTime="2026-01-27 00:37:19.208298404 +0000 UTC m=+1817.514075288" watchObservedRunningTime="2026-01-27 00:37:19.211272004 +0000 UTC m=+1817.517048898" Jan 27 00:37:20 crc kubenswrapper[4774]: I0127 00:37:20.489230 4774 scope.go:117] "RemoveContainer" containerID="14336beb86026bc90d2a9fb6311a3227ae7133bc262a6b935592389475831309" Jan 27 00:37:20 crc kubenswrapper[4774]: I0127 00:37:20.528287 4774 scope.go:117] "RemoveContainer" containerID="7bdfb69e9fa9696e37a7d9b25d175b63e5343122a1e569faca1c08955f8bbb1d" Jan 27 00:37:20 crc kubenswrapper[4774]: I0127 00:37:20.558258 4774 scope.go:117] "RemoveContainer" containerID="89911886b25b8c950e2ce162b903638384fd14c3a7c90ec3cfa3cf3f5c1e0950" Jan 27 00:37:20 crc kubenswrapper[4774]: I0127 00:37:20.605513 4774 scope.go:117] "RemoveContainer" containerID="49b5f230df2b7f05eeb23873ef8a668173f6d60ab2c8dbad4bbe6685a4aa6248" Jan 27 00:37:23 crc kubenswrapper[4774]: I0127 00:37:23.357162 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:37:23 crc kubenswrapper[4774]: E0127 00:37:23.357730 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:37:37 crc kubenswrapper[4774]: I0127 00:37:37.356403 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:37:37 crc kubenswrapper[4774]: E0127 00:37:37.357369 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:37:51 crc kubenswrapper[4774]: I0127 00:37:51.356901 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:37:51 crc kubenswrapper[4774]: E0127 00:37:51.359592 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:38:03 crc kubenswrapper[4774]: I0127 00:38:03.357878 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:38:03 crc kubenswrapper[4774]: E0127 00:38:03.358838 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:38:06 crc kubenswrapper[4774]: I0127 00:38:06.630659 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tjg8b_ec68f29b-2b30-4dff-b875-7617466be51b/control-plane-machine-set-operator/0.log" Jan 27 00:38:06 crc kubenswrapper[4774]: I0127 00:38:06.765762 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kv45n_7cb54417-3e44-4c36-a1cf-438a705f8dcf/kube-rbac-proxy/0.log" Jan 27 00:38:06 crc kubenswrapper[4774]: I0127 00:38:06.766098 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kv45n_7cb54417-3e44-4c36-a1cf-438a705f8dcf/machine-api-operator/0.log" Jan 27 00:38:17 crc kubenswrapper[4774]: I0127 00:38:17.357317 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:38:17 crc kubenswrapper[4774]: E0127 00:38:17.358238 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:38:19 crc kubenswrapper[4774]: I0127 00:38:19.090300 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-9mmc4_0a5e3cb3-d23f-405b-bb88-18159ee24067/cert-manager-controller/0.log" Jan 27 00:38:19 crc kubenswrapper[4774]: I0127 00:38:19.241240 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-hp7q6_84e34f19-48c2-4096-83e7-95b9fd16a1e6/cert-manager-cainjector/0.log" Jan 27 00:38:19 crc kubenswrapper[4774]: I0127 00:38:19.318387 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-gg42r_32c71eef-359e-42d1-a9c8-d0f392ecabaa/cert-manager-webhook/0.log" Jan 27 00:38:28 crc kubenswrapper[4774]: I0127 00:38:28.357175 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:38:28 crc kubenswrapper[4774]: E0127 00:38:28.358249 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2nl9s_openshift-machine-config-operator(3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" Jan 27 00:38:34 crc kubenswrapper[4774]: I0127 00:38:34.942667 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-qzvj5_82ad6e88-a32b-4f4f-9a96-66d10c58a7d9/prometheus-operator/0.log" Jan 27 00:38:35 crc kubenswrapper[4774]: I0127 00:38:35.143690 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6495c554dc-2b75r_dfee63d3-9a5d-46f6-b984-78d6a837e20c/prometheus-operator-admission-webhook/0.log" Jan 27 00:38:35 crc kubenswrapper[4774]: I0127 00:38:35.182913 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6495c554dc-548dl_4ac18775-726a-43da-a184-dfd1565544f1/prometheus-operator-admission-webhook/0.log" Jan 27 00:38:35 crc kubenswrapper[4774]: I0127 00:38:35.334873 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-fqps4_3247d37e-1277-411a-ad8b-ffcd6172206f/operator/0.log" Jan 27 00:38:35 crc kubenswrapper[4774]: I0127 00:38:35.383956 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-b7pt6_96fc7bad-ed57-4110-afa8-9a6e5748c292/perses-operator/0.log" Jan 27 00:38:41 crc kubenswrapper[4774]: I0127 00:38:41.357197 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:38:41 crc kubenswrapper[4774]: I0127 00:38:41.932434 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9"} Jan 27 00:38:51 crc kubenswrapper[4774]: I0127 
00:38:51.583015 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/util/0.log" Jan 27 00:38:51 crc kubenswrapper[4774]: I0127 00:38:51.778728 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/util/0.log" Jan 27 00:38:51 crc kubenswrapper[4774]: I0127 00:38:51.812521 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/pull/0.log" Jan 27 00:38:51 crc kubenswrapper[4774]: I0127 00:38:51.842006 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.015693 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/util/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.056010 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/extract/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.056596 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931axgbsq_4bdb298d-0c46-429d-b4c2-44d106881eb7/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.240317 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/util/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.433625 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.450789 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/util/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.476227 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.645999 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/extract/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.755994 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/util/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.815283 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fvhnbb_bcd9000d-51f5-47d1-9fb0-a1177a6c6b06/pull/0.log" Jan 27 00:38:52 crc kubenswrapper[4774]: I0127 00:38:52.867714 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.064571 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.065809 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.123909 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.373441 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.383700 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.384290 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5e7xf2b_1b3f5387-0b7c-4c6d-8aff-e293c038aafb/extract/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.555252 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.717780 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.757189 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.769601 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.960176 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/util/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.965456 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/pull/0.log" Jan 27 00:38:53 crc kubenswrapper[4774]: I0127 00:38:53.977588 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087hzq2_f2086af6-4ed5-4f66-bf01-aa661ba5a168/extract/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.151105 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.378686 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-content/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.389267 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-content/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.392282 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.575406 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.593589 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/extract-content/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.809756 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.966007 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-utilities/0.log" Jan 27 00:38:54 crc kubenswrapper[4774]: I0127 00:38:54.968456 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7bhhh_0b4a869e-2e74-4226-b41c-c8a481a0728b/registry-server/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.005494 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.022553 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.187646 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.215197 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/extract-utilities/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.304261 4774 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bcz6w_fb20bbdd-0a13-4d07-8c4a-8c2285de3173/marketplace-operator/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.497555 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-utilities/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.614435 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6krcw_89dc2f6e-3f9e-4098-b5d4-ff9481de0824/registry-server/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.681196 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.698120 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-utilities/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.706566 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-content/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.888171 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-utilities/0.log" Jan 27 00:38:55 crc kubenswrapper[4774]: I0127 00:38:55.907041 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/extract-content/0.log" Jan 27 00:38:56 crc kubenswrapper[4774]: I0127 00:38:56.441654 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tg7j5_8be8b5b9-e3bc-4236-90ca-3d3808fa39b4/registry-server/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.397934 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-qzvj5_82ad6e88-a32b-4f4f-9a96-66d10c58a7d9/prometheus-operator/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.404259 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6495c554dc-548dl_4ac18775-726a-43da-a184-dfd1565544f1/prometheus-operator-admission-webhook/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.406546 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6495c554dc-2b75r_dfee63d3-9a5d-46f6-b984-78d6a837e20c/prometheus-operator-admission-webhook/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.578871 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-fqps4_3247d37e-1277-411a-ad8b-ffcd6172206f/operator/0.log" Jan 27 00:39:10 crc kubenswrapper[4774]: I0127 00:39:10.622213 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-b7pt6_96fc7bad-ed57-4110-afa8-9a6e5748c292/perses-operator/0.log" Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.209779 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"] Jan 27 00:39:50 crc 
kubenswrapper[4774]: I0127 00:39:50.212727 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.221009 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"] Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.388719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") pod \"infrawatch-operators-2njqk\" (UID: \"e8e943dd-43b5-4948-b976-abb2c181609b\") " pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.490305 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") pod \"infrawatch-operators-2njqk\" (UID: \"e8e943dd-43b5-4948-b976-abb2c181609b\") " pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.530579 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") pod \"infrawatch-operators-2njqk\" (UID: \"e8e943dd-43b5-4948-b976-abb2c181609b\") " pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:39:50 crc kubenswrapper[4774]: I0127 00:39:50.830909 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:39:51 crc kubenswrapper[4774]: I0127 00:39:51.146709 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"] Jan 27 00:39:51 crc kubenswrapper[4774]: I0127 00:39:51.582379 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-2njqk" event={"ID":"e8e943dd-43b5-4948-b976-abb2c181609b","Type":"ContainerStarted","Data":"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"} Jan 27 00:39:51 crc kubenswrapper[4774]: I0127 00:39:51.583095 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-2njqk" event={"ID":"e8e943dd-43b5-4948-b976-abb2c181609b","Type":"ContainerStarted","Data":"77313127db6b9d739b39d1ec129516ee15b3e8ddcd5f636849ae6284149574cf"} Jan 27 00:39:51 crc kubenswrapper[4774]: I0127 00:39:51.621497 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-2njqk" podStartSLOduration=1.494574756 podStartE2EDuration="1.621454591s" podCreationTimestamp="2026-01-27 00:39:50 +0000 UTC" firstStartedPulling="2026-01-27 00:39:51.160767116 +0000 UTC m=+1969.466544010" lastFinishedPulling="2026-01-27 00:39:51.287646971 +0000 UTC m=+1969.593423845" observedRunningTime="2026-01-27 00:39:51.611196005 +0000 UTC m=+1969.916972929" watchObservedRunningTime="2026-01-27 00:39:51.621454591 +0000 UTC m=+1969.927231515" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.609409 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"] Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.612548 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.643509 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"] Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.667334 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.667402 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.667473 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.768649 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.768741 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.768775 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.769507 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.769930 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.808956 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") pod \"redhat-operators-nc8j2\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:54 crc kubenswrapper[4774]: I0127 00:39:54.947385 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:39:55 crc kubenswrapper[4774]: I0127 00:39:55.180269 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"] Jan 27 00:39:55 crc kubenswrapper[4774]: W0127 00:39:55.188074 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfea77ced_5bd1_430a_81c4_494ff6946525.slice/crio-c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07 WatchSource:0}: Error finding container c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07: Status 404 returned error can't find the container with id c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07 Jan 27 00:39:55 crc kubenswrapper[4774]: I0127 00:39:55.617946 4774 generic.go:334] "Generic (PLEG): container finished" podID="fea77ced-5bd1-430a-81c4-494ff6946525" containerID="8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37" exitCode=0 Jan 27 00:39:55 crc kubenswrapper[4774]: I0127 00:39:55.618016 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerDied","Data":"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37"} Jan 27 00:39:55 crc kubenswrapper[4774]: I0127 00:39:55.618406 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerStarted","Data":"c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07"} Jan 27 00:39:56 crc kubenswrapper[4774]: I0127 00:39:56.629712 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerStarted","Data":"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"} Jan 27 00:39:57 crc kubenswrapper[4774]: I0127 00:39:57.641548 4774 generic.go:334] "Generic (PLEG): container finished" podID="fea77ced-5bd1-430a-81c4-494ff6946525" containerID="4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2" exitCode=0 Jan 27 00:39:57 crc kubenswrapper[4774]: I0127 00:39:57.641646 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerDied","Data":"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"} Jan 27 00:39:57 crc kubenswrapper[4774]: I0127 00:39:57.644365 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 00:39:58 crc kubenswrapper[4774]: I0127 00:39:58.654331 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerStarted","Data":"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"} Jan 27 00:39:58 crc kubenswrapper[4774]: I0127 00:39:58.688036 4774 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-nc8j2" podStartSLOduration=2.253131854 podStartE2EDuration="4.68800349s" podCreationTimestamp="2026-01-27 00:39:54 +0000 UTC" firstStartedPulling="2026-01-27 00:39:55.621399897 +0000 UTC m=+1973.927176781" lastFinishedPulling="2026-01-27 00:39:58.056271533 +0000 UTC m=+1976.362048417" observedRunningTime="2026-01-27 00:39:58.683201791 +0000 UTC m=+1976.988978745" watchObservedRunningTime="2026-01-27 00:39:58.68800349 +0000 UTC m=+1976.993780434" Jan 27 00:40:00 crc kubenswrapper[4774]: I0127 00:40:00.831387 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:00 crc kubenswrapper[4774]: I0127 00:40:00.831485 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:00 crc kubenswrapper[4774]: I0127 00:40:00.876929 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:01 crc kubenswrapper[4774]: I0127 00:40:01.757763 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:03 crc kubenswrapper[4774]: I0127 00:40:03.596842 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"] Jan 27 00:40:03 crc kubenswrapper[4774]: I0127 00:40:03.712634 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-2njqk" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" containerName="registry-server" containerID="cri-o://9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471" gracePeriod=2 Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.124599 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.176582 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") pod \"e8e943dd-43b5-4948-b976-abb2c181609b\" (UID: \"e8e943dd-43b5-4948-b976-abb2c181609b\") " Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.185304 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76" (OuterVolumeSpecName: "kube-api-access-qxx76") pod "e8e943dd-43b5-4948-b976-abb2c181609b" (UID: "e8e943dd-43b5-4948-b976-abb2c181609b"). InnerVolumeSpecName "kube-api-access-qxx76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.279344 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxx76\" (UniqueName: \"kubernetes.io/projected/e8e943dd-43b5-4948-b976-abb2c181609b-kube-api-access-qxx76\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.729230 4774 generic.go:334] "Generic (PLEG): container finished" podID="e8e943dd-43b5-4948-b976-abb2c181609b" containerID="9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471" exitCode=0 Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.729354 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-2njqk" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.729393 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-2njqk" event={"ID":"e8e943dd-43b5-4948-b976-abb2c181609b","Type":"ContainerDied","Data":"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"} Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.731298 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-2njqk" event={"ID":"e8e943dd-43b5-4948-b976-abb2c181609b","Type":"ContainerDied","Data":"77313127db6b9d739b39d1ec129516ee15b3e8ddcd5f636849ae6284149574cf"} Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.731416 4774 scope.go:117] "RemoveContainer" containerID="9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.758786 4774 scope.go:117] "RemoveContainer" containerID="9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471" Jan 27 00:40:04 crc kubenswrapper[4774]: E0127 00:40:04.759454 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471\": container with ID starting with 9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471 not found: ID does not exist" containerID="9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.759544 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471"} err="failed to get container status \"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471\": rpc error: code = NotFound desc = could not find container \"9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471\": container with ID starting with 9869f376da68c37defd08345af053fbf73b9cbbf4e4afd913a7808d8dc69d471 not found: ID does not exist" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.762487 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"] Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.769233 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-2njqk"] Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.948257 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:40:04 crc kubenswrapper[4774]: I0127 00:40:04.948653 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:40:06 crc kubenswrapper[4774]: I0127 00:40:06.014646 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nc8j2" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server" probeResult="failure" output=< Jan 27 00:40:06 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Jan 27 00:40:06 crc kubenswrapper[4774]: > Jan 27 00:40:06 crc kubenswrapper[4774]: I0127 00:40:06.369464 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" path="/var/lib/kubelet/pods/e8e943dd-43b5-4948-b976-abb2c181609b/volumes" Jan 27 00:40:07 crc kubenswrapper[4774]: I0127 
00:40:07.769271 4774 generic.go:334] "Generic (PLEG): container finished" podID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" exitCode=0 Jan 27 00:40:07 crc kubenswrapper[4774]: I0127 00:40:07.769422 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqfx/must-gather-qclb8" event={"ID":"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec","Type":"ContainerDied","Data":"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb"} Jan 27 00:40:07 crc kubenswrapper[4774]: I0127 00:40:07.770421 4774 scope.go:117] "RemoveContainer" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" Jan 27 00:40:07 crc kubenswrapper[4774]: I0127 00:40:07.931349 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6xqfx_must-gather-qclb8_0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec/gather/0.log" Jan 27 00:40:14 crc kubenswrapper[4774]: I0127 00:40:14.941949 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"] Jan 27 00:40:14 crc kubenswrapper[4774]: I0127 00:40:14.943104 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6xqfx/must-gather-qclb8" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="copy" containerID="cri-o://4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" gracePeriod=2 Jan 27 00:40:14 crc kubenswrapper[4774]: I0127 00:40:14.949846 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6xqfx/must-gather-qclb8"] Jan 27 00:40:14 crc kubenswrapper[4774]: I0127 00:40:14.999511 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.078291 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.330414 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6xqfx_must-gather-qclb8_0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec/copy/0.log" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.331172 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.416596 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") pod \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.416832 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") pod \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\" (UID: \"0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec\") " Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.428083 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7" (OuterVolumeSpecName: "kube-api-access-d48h7") pod "0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" (UID: "0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec"). 
InnerVolumeSpecName "kube-api-access-d48h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.488353 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" (UID: "0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.519090 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d48h7\" (UniqueName: \"kubernetes.io/projected/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-kube-api-access-d48h7\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.519149 4774 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.601081 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"] Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.857493 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6xqfx_must-gather-qclb8_0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec/copy/0.log" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.858483 4774 generic.go:334] "Generic (PLEG): container finished" podID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerID="4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" exitCode=143 Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.858597 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6xqfx/must-gather-qclb8" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.858675 4774 scope.go:117] "RemoveContainer" containerID="4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.890276 4774 scope.go:117] "RemoveContainer" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.958167 4774 scope.go:117] "RemoveContainer" containerID="4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" Jan 27 00:40:15 crc kubenswrapper[4774]: E0127 00:40:15.958809 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0\": container with ID starting with 4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0 not found: ID does not exist" containerID="4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.958878 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0"} err="failed to get container status \"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0\": rpc error: code = NotFound desc = could not find container \"4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0\": container with ID starting with 4758978dabc850d8c31c2539087df7830467afc2ced618437119c751146171a0 not found: ID does not exist" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.958922 4774 scope.go:117] "RemoveContainer" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" Jan 27 00:40:15 crc kubenswrapper[4774]: E0127 00:40:15.959471 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb\": container with ID starting with 18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb not found: ID does not exist" containerID="18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb" Jan 27 00:40:15 crc kubenswrapper[4774]: I0127 00:40:15.959544 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb"} err="failed to get container status \"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb\": rpc error: code = NotFound desc = could not find container \"18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb\": container with ID starting with 18f075127f58d616573b86732e33467a236b4355f7f674dfc18a8b3a93c385cb not found: ID does not exist" Jan 27 00:40:16 crc kubenswrapper[4774]: I0127 00:40:16.373562 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" path="/var/lib/kubelet/pods/0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec/volumes" Jan 27 00:40:16 crc kubenswrapper[4774]: I0127 00:40:16.866679 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nc8j2" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server" containerID="cri-o://7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b" gracePeriod=2 Jan 27 00:40:17 crc 
kubenswrapper[4774]: I0127 00:40:17.326097 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.363605 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") pod \"fea77ced-5bd1-430a-81c4-494ff6946525\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.363772 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") pod \"fea77ced-5bd1-430a-81c4-494ff6946525\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.363832 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") pod \"fea77ced-5bd1-430a-81c4-494ff6946525\" (UID: \"fea77ced-5bd1-430a-81c4-494ff6946525\") " Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.366821 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities" (OuterVolumeSpecName: "utilities") pod "fea77ced-5bd1-430a-81c4-494ff6946525" (UID: "fea77ced-5bd1-430a-81c4-494ff6946525"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.380090 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn" (OuterVolumeSpecName: "kube-api-access-fk2cn") pod "fea77ced-5bd1-430a-81c4-494ff6946525" (UID: "fea77ced-5bd1-430a-81c4-494ff6946525"). InnerVolumeSpecName "kube-api-access-fk2cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.465820 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.466008 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk2cn\" (UniqueName: \"kubernetes.io/projected/fea77ced-5bd1-430a-81c4-494ff6946525-kube-api-access-fk2cn\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.524372 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fea77ced-5bd1-430a-81c4-494ff6946525" (UID: "fea77ced-5bd1-430a-81c4-494ff6946525"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.567145 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fea77ced-5bd1-430a-81c4-494ff6946525-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.883822 4774 generic.go:334] "Generic (PLEG): container finished" podID="fea77ced-5bd1-430a-81c4-494ff6946525" containerID="7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b" exitCode=0 Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.883974 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerDied","Data":"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"} Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.884050 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8j2" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.884115 4774 scope.go:117] "RemoveContainer" containerID="7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.884087 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8j2" event={"ID":"fea77ced-5bd1-430a-81c4-494ff6946525","Type":"ContainerDied","Data":"c15104766ad4eb5276f13fe7561352bf070cbb370cee9773ec55b6e473cc6d07"} Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.921082 4774 scope.go:117] "RemoveContainer" containerID="4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2" Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.948065 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"] Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.962718 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nc8j2"] Jan 27 00:40:17 crc kubenswrapper[4774]: I0127 00:40:17.969658 4774 scope.go:117] "RemoveContainer" containerID="8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37" Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.017641 4774 scope.go:117] "RemoveContainer" containerID="7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b" Jan 27 00:40:18 crc kubenswrapper[4774]: E0127 00:40:18.018697 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b\": container with ID starting with 7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b not found: ID does not exist" containerID="7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b" Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.018770 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b"} err="failed to get container status \"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b\": rpc error: code = NotFound desc = could not find container \"7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b\": container with ID starting with 7ca2af3ea24f56fcf9f9590582bfd0221402cccf79ddd1647ff278639633430b not found: ID does not exist" Jan 27 00:40:18 crc 
kubenswrapper[4774]: I0127 00:40:18.018818 4774 scope.go:117] "RemoveContainer" containerID="4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2" Jan 27 00:40:18 crc kubenswrapper[4774]: E0127 00:40:18.019455 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2\": container with ID starting with 4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2 not found: ID does not exist" containerID="4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2" Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.019496 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2"} err="failed to get container status \"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2\": rpc error: code = NotFound desc = could not find container \"4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2\": container with ID starting with 4c4ba98b837f5b01a8041120594812405d13e5d05e302fd28b1680dfc39d86c2 not found: ID does not exist" Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.019540 4774 scope.go:117] "RemoveContainer" containerID="8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37" Jan 27 00:40:18 crc kubenswrapper[4774]: E0127 00:40:18.019917 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37\": container with ID starting with 8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37 not found: ID does not exist" containerID="8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37" Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.019970 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37"} err="failed to get container status \"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37\": rpc error: code = NotFound desc = could not find container \"8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37\": container with ID starting with 8ea565d616d4ea509618678a3a834bef4bf6ed9859ccef611f704e912009ee37 not found: ID does not exist" Jan 27 00:40:18 crc kubenswrapper[4774]: I0127 00:40:18.372921 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" path="/var/lib/kubelet/pods/fea77ced-5bd1-430a-81c4-494ff6946525/volumes" Jan 27 00:41:06 crc kubenswrapper[4774]: I0127 00:41:06.675769 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:41:06 crc kubenswrapper[4774]: I0127 00:41:06.676799 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:41:36 crc kubenswrapper[4774]: I0127 00:41:36.675613 4774 patch_prober.go:28] interesting 
pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:41:36 crc kubenswrapper[4774]: I0127 00:41:36.676805 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.675643 4774 patch_prober.go:28] interesting pod/machine-config-daemon-2nl9s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.678056 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.678308 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.679161 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9"} pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 00:42:06 crc kubenswrapper[4774]: I0127 00:42:06.679312 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" podUID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerName="machine-config-daemon" containerID="cri-o://ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9" gracePeriod=600 Jan 27 00:42:07 crc kubenswrapper[4774]: I0127 00:42:07.072414 4774 generic.go:334] "Generic (PLEG): container finished" podID="3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a" containerID="ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9" exitCode=0 Jan 27 00:42:07 crc kubenswrapper[4774]: I0127 00:42:07.072842 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerDied","Data":"ea1f7f1507e8ab30d775c393877ef9ca6aab35f20f5448926e8f50f33e3e67f9"} Jan 27 00:42:07 crc kubenswrapper[4774]: I0127 00:42:07.072924 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2nl9s" event={"ID":"3605b94c-c171-4ff3-a3c9-d8e6a7cf7a9a","Type":"ContainerStarted","Data":"9542a14f7f81cbd726fa24778812835add869f0c4df1484cb45f72957fe0636b"} Jan 27 00:42:07 crc kubenswrapper[4774]: I0127 00:42:07.072955 4774 scope.go:117] "RemoveContainer" containerID="d2c155647b2b81294834842ae650926cdcd394a986ee1e8ee3c98797880e6541" Jan 27 00:42:48 
crc kubenswrapper[4774]: I0127 00:42:48.194944 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197194 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="extract-content" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197226 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="extract-content" Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197253 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" containerName="registry-server" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197266 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" containerName="registry-server" Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197293 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="gather" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197307 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="gather" Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197325 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197338 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server" Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197362 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="copy" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197373 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="copy" Jan 27 00:42:48 crc kubenswrapper[4774]: E0127 00:42:48.197397 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="extract-utilities" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197411 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="extract-utilities" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197687 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="copy" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197720 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e943dd-43b5-4948-b976-abb2c181609b" containerName="registry-server" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197740 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd9f455-03a8-4a6c-8f5b-ed6ab7ee9dec" containerName="gather" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.197771 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea77ced-5bd1-430a-81c4-494ff6946525" containerName="registry-server" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.199724 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.202350 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.316696 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.316781 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.316815 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.419841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.419989 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.420030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.420612 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.421528 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.450683 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") pod \"community-operators-q86gp\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.534365 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:48 crc kubenswrapper[4774]: I0127 00:42:48.918942 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:42:49 crc kubenswrapper[4774]: I0127 00:42:49.552294 4774 generic.go:334] "Generic (PLEG): container finished" podID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" containerID="23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322" exitCode=0 Jan 27 00:42:49 crc kubenswrapper[4774]: I0127 00:42:49.552417 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerDied","Data":"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322"} Jan 27 00:42:49 crc kubenswrapper[4774]: I0127 00:42:49.552459 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerStarted","Data":"d8bdf409fa5b9682622fe9503e85e6d67518d56a44cf740ce575fc7a5663caa7"} Jan 27 00:42:51 crc kubenswrapper[4774]: I0127 00:42:51.595167 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerDied","Data":"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc"} Jan 27 00:42:51 crc kubenswrapper[4774]: I0127 00:42:51.595192 4774 generic.go:334] "Generic (PLEG): container finished" podID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" containerID="bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc" exitCode=0 Jan 27 00:42:52 crc kubenswrapper[4774]: I0127 00:42:52.614666 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerStarted","Data":"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f"} Jan 27 00:42:52 crc kubenswrapper[4774]: I0127 00:42:52.659951 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q86gp" podStartSLOduration=2.192490775 podStartE2EDuration="4.659913335s" podCreationTimestamp="2026-01-27 00:42:48 +0000 UTC" firstStartedPulling="2026-01-27 00:42:49.557576743 +0000 UTC m=+2147.863353647" lastFinishedPulling="2026-01-27 00:42:52.024999323 +0000 UTC m=+2150.330776207" observedRunningTime="2026-01-27 00:42:52.64560051 +0000 UTC m=+2150.951377474" watchObservedRunningTime="2026-01-27 00:42:52.659913335 +0000 UTC m=+2150.965690249" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.535943 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.536750 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.625968 4774 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.743701 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:42:58 crc kubenswrapper[4774]: I0127 00:42:58.883994 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:43:00 crc kubenswrapper[4774]: I0127 00:43:00.694277 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q86gp" podUID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" containerName="registry-server" containerID="cri-o://1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" gracePeriod=2 Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.233036 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.285541 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") pod \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.286837 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities" (OuterVolumeSpecName: "utilities") pod "f342913b-d5a0-45a5-a4ec-c48a7d9adb39" (UID: "f342913b-d5a0-45a5-a4ec-c48a7d9adb39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.293983 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") pod \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.294113 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") pod \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\" (UID: \"f342913b-d5a0-45a5-a4ec-c48a7d9adb39\") " Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.294936 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.308260 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22" (OuterVolumeSpecName: "kube-api-access-dvr22") pod "f342913b-d5a0-45a5-a4ec-c48a7d9adb39" (UID: "f342913b-d5a0-45a5-a4ec-c48a7d9adb39"). InnerVolumeSpecName "kube-api-access-dvr22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.396504 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvr22\" (UniqueName: \"kubernetes.io/projected/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-kube-api-access-dvr22\") on node \"crc\" DevicePath \"\"" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.414234 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f342913b-d5a0-45a5-a4ec-c48a7d9adb39" (UID: "f342913b-d5a0-45a5-a4ec-c48a7d9adb39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.499287 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f342913b-d5a0-45a5-a4ec-c48a7d9adb39-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.708661 4774 generic.go:334] "Generic (PLEG): container finished" podID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" containerID="1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" exitCode=0 Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.708766 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerDied","Data":"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f"} Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.709212 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q86gp" event={"ID":"f342913b-d5a0-45a5-a4ec-c48a7d9adb39","Type":"ContainerDied","Data":"d8bdf409fa5b9682622fe9503e85e6d67518d56a44cf740ce575fc7a5663caa7"} Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.709250 4774 scope.go:117] "RemoveContainer" containerID="1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.708884 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q86gp" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.760110 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.765813 4774 scope.go:117] "RemoveContainer" containerID="bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.767175 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q86gp"] Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.798809 4774 scope.go:117] "RemoveContainer" containerID="23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.833384 4774 scope.go:117] "RemoveContainer" containerID="1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" Jan 27 00:43:01 crc kubenswrapper[4774]: E0127 00:43:01.834554 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f\": container with ID starting with 1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f not found: ID does not exist" containerID="1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.834630 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f"} err="failed to get container status \"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f\": rpc error: code = NotFound desc = could not find container \"1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f\": container with ID starting with 1dbe12d11fe58b50ee8ef166a6acf430e480d13d3ef313805a43a77e2a69f20f not found: ID does not exist" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.834684 4774 scope.go:117] "RemoveContainer" containerID="bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc" Jan 27 00:43:01 crc kubenswrapper[4774]: E0127 00:43:01.835446 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc\": container with ID starting with bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc not found: ID does not exist" containerID="bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.835487 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc"} err="failed to get container status \"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc\": rpc error: code = NotFound desc = could not find container \"bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc\": container with ID starting with bde8648ce2ad098410300eefca9e9228f2ebfab670f17a3f09a20baacb52a1fc not found: ID does not exist" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.835517 4774 scope.go:117] "RemoveContainer" containerID="23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322" Jan 27 00:43:01 crc kubenswrapper[4774]: E0127 00:43:01.836005 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322\": container with ID starting with 23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322 not found: ID does not exist" containerID="23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322" Jan 27 00:43:01 crc kubenswrapper[4774]: I0127 00:43:01.836258 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322"} err="failed to get container status \"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322\": rpc error: code = NotFound desc = could not find container \"23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322\": container with ID starting with 23197e91cd2079be880ec28251726c752aab2e40eef62979ecd39a36efe4a322 not found: ID does not exist" Jan 27 00:43:02 crc kubenswrapper[4774]: I0127 00:43:02.370434 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f342913b-d5a0-45a5-a4ec-c48a7d9adb39" path="/var/lib/kubelet/pods/f342913b-d5a0-45a5-a4ec-c48a7d9adb39/volumes" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515136005104024440 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015136005104017355 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015136000320016473 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015136000320015443 5ustar corecore